• 0 Posts
  • 59 Comments
Joined 1 year ago
Cake day: July 19th, 2023

  • There’s nothing stopping you from going to YouTube, listening to a bunch of hit country songs there, and using that inspiration to write a “hit country song about getting your balls caught in a screen door”. That music was free to access, and your ability to create derivative works is fully protected by copyright law.

    So if that’s what the AI is doing, it would be fully legal if it were a person. The question courts are trying to figure out is whether AI should be treated like a person when it comes to “learning” and creating derivative works.

    I think there are good arguments on both sides of that issue. The big advantage of ruling against AI having those rights is that record labels and other rights holders can get compensated for their content being used. The main disadvantage is that high cost barriers to training material will kill off open-source and small-company AI, guaranteeing that generative AI is fully controlled by tech giants like Google, Microsoft, and Adobe.

    I think the best legal outcome is one that attempts to protect both: companies and individuals below a certain revenue threshold (or other scale metric) can freely train on the open web, but are required to track what was used for training. As they grow, they hit different tiers where they’re required to start paying for the content their model was trained on. Obviously this needs a lot of work before it’s a viable option, but I think something along these lines is the best way to both have competition in the AI space and make sure people get compensated.
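
    To make the tiers concrete, here’s a minimal sketch of how that kind of revenue-based obligation could look. Every threshold and rate below is invented for illustration; the comment doesn’t propose actual numbers.

    ```python
    # Hypothetical tiered licensing sketch: all thresholds and royalty
    # rates here are made up for illustration, not a real proposal.

    TIERS = [
        # (annual revenue ceiling in USD, royalty rate applied to revenue)
        (1_000_000, 0.00),       # small orgs: train freely, but track sources
        (50_000_000, 0.02),      # mid-size: modest payments to rights holders
        (float("inf"), 0.05),    # largest companies: highest rate
    ]

    def royalty_owed(annual_revenue: float) -> float:
        """Yearly payment owed to a rights-holder pool under this sketch."""
        for ceiling, rate in TIERS:
            if annual_revenue <= ceiling:
                return annual_revenue * rate
        raise ValueError("unreachable: the last tier is unbounded")

    print(royalty_owed(500_000))     # 0.0 -> free tier, tracking only
    print(royalty_owed(10_000_000))  # 200000.0 -> 2% of revenue
    ```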


  • There’s not much concrete data I can find on accident rates for highways vs non-highways. You would expect accidents on small side streets to have lower fatality rates, though, while wrecks at highway speeds would have much higher ones. From what I can see, a government investigation into Autopilot’s safety found 13 deaths, which is a very low number given the billions of miles driven with Autopilot on (3+ billion in 2020, probably 5-10 billion now? Just guessing here since I can’t find a newer number).
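
    Back-of-the-envelope math on those numbers (a quick sketch; the mileage figures are my guesses above, and the US baseline of roughly 1.3 deaths per 100 million vehicle miles is the commonly cited ballpark, not an exact figure):

    ```python
    # Rough fatality-rate comparison. Autopilot mileage is guessed from
    # the figures above; the US baseline of ~1.3 deaths per 100M vehicle
    # miles is the commonly cited national ballpark.

    autopilot_deaths = 13
    for miles in (3e9, 5e9, 10e9):              # plausible Autopilot totals
        rate = autopilot_deaths / miles * 1e8   # deaths per 100M miles
        print(f"{miles / 1e9:.0f}B miles -> {rate:.2f} deaths per 100M miles")

    us_baseline = 1.3  # approx. US deaths per 100M vehicle miles traveled
    print(f"US average -> {us_baseline:.2f} deaths per 100M miles")
    ```

    Even at the low-end mileage guess, that comes out several times below the national average, though the comparison is muddied by all the factors below.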

    But yeah, there are so many factors with driving that it’s hard to get an exact picture. Rural roads have the highest fatality rates (making up as much as 90% of accident fatalities in some states), and it’s not hard to imagine that Teslas are less popular in rural communities (although they seem to be pretty popular where I live).

    But rural roads are also a perfect use case for Autopilot: generally easy driving conditions where most deaths happen because of speeding and drivers not paying attention. Increased adoption of self-driving cars in rural communities would probably save a lot of lives.


  • It reminds me of the debate around self-driving cars. Tesla has a flawed implementation of self-driving tech that tries to gather all the information it needs through camera inputs instead of using multiple sensor types. This doesn’t always work, and it has led to some questionable crashes where it definitely looks like a human driver could have avoided them.

    However, even with Tesla’s flawed self-driving, they’re reported to have far fewer wrecks than human drivers. According to Tesla’s safety report, Teslas in self-driving mode average 5-6 million miles per accident, versus 1-1.5 million miles for Tesla drivers not using it (the US average is 500-750k miles per accident).
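
    Inverting those miles-per-accident figures makes the gap easier to compare. A quick sketch using the midpoints of the ranges quoted above (the midpoints are my choice, not Tesla’s):

    ```python
    # Convert miles-per-accident into accidents per million miles, using
    # midpoints of the ranges quoted above (Tesla safety report + US range).

    miles_per_accident = {
        "Tesla, self-driving on":  5.5e6,    # midpoint of 5-6M
        "Tesla, self-driving off": 1.25e6,   # midpoint of 1-1.5M
        "US average":              0.625e6,  # midpoint of 500-750k
    }

    baseline = 1e6 / miles_per_accident["US average"]  # accidents per 1M miles
    for label, mpa in miles_per_accident.items():
        rate = 1e6 / mpa
        print(f"{label}: {rate:.2f} accidents per 1M miles "
              f"({baseline / rate:.1f}x safer than the US average)")
    ```

    By those midpoints, self-driving mode comes out roughly 8-9x better than the US average, and Tesla drivers without it about 2x better.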

    So a system like this doesn’t have to be perfect to do a far better job than people can, but that doesn’t mean it won’t feel terrible for the unlucky people things go poorly for.