• 0 Posts
  • 37 Comments
Joined 2 years ago
Cake day: June 20th, 2023

  • There’s a reason I specifically opened with how in CT it isn’t an issue before explaining that in the majority of the country (notice I said country, not population) it still is. Like the CT governor, you still don’t seem to quite grasp the reality of what it is like to live somewhere other than a built-up urban area. There are no buses here, there are no trains here. If I wanted to rent a gas car, I’d need to drive 120 miles to the city because there isn’t a rental option in my town (which actually qualifies as a “city”). It’s an hour’s drive to the nearest movie theater.

    While NYC alone has more people than the entire state of OK, there are still millions of people living here who simply can’t get by with an EV for their day-to-day lives, let alone if they want to make a trip by any mode of transportation. Add in the fact that, even with current developments and proposals, battery energy density is hard-limited by physics and chemistry; unless a completely new method of energy storage is invented, it will always be 1/100th of what gasoline has, meaning EVs will continue to be absurdly overweight (a rough back-of-envelope comparison is sketched below).

    Don’t worry, I’m not in a rush to sell any of my ICE vehicles. At this point I might literally hold onto them forever, because there isn’t a single car being made new right now that I like better than anything I currently own.
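    A minimal back-of-envelope sketch of that gap, assuming roughly 46 MJ/kg for gasoline and roughly 0.6 MJ/kg for a complete lithium-ion pack; these figures and the efficiency numbers below are ballpark assumptions for illustration, not sourced data:

    ```python
    # Rough specific-energy comparison between gasoline and a lithium-ion pack.
    # All numbers are ballpark assumptions for illustration only.
    GASOLINE_MJ_PER_KG = 46.0     # typical heating value of gasoline
    BATTERY_PACK_MJ_PER_KG = 0.6  # assumed pack-level figure (~165 Wh/kg)

    ratio = GASOLINE_MJ_PER_KG / BATTERY_PACK_MJ_PER_KG
    print(f"Gasoline stores roughly {ratio:.0f}x more energy per kg than the pack")

    # Electric drivetrains claw some of that back through efficiency,
    # but the weight penalty of the pack itself remains large.
    ICE_EFFICIENCY = 0.30  # assumed tank-to-wheel efficiency
    EV_EFFICIENCY = 0.85   # assumed battery-to-wheel efficiency
    print(f"Efficiency-adjusted gap: ~{ratio * ICE_EFFICIENCY / EV_EFFICIENCY:.0f}x")
    ```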

  • I would love an actual Razr that works, but foldables just seem like a cheap gimmick. I’m happy with my 13 Pro, though I wouldn’t want anything bigger. My wife, on the other hand, has no intention of abandoning her Mini no matter what Apple offers in full-size models to come. I do envy my co-worker with a Mini who can keep it in a sleeve pocket, though.

  • It is about adding code. No dataset will be 100% free of undesirable results. No matter what marketing departments wish, AI isn’t anything close to human “intelligence”; it is just a function of learned correlations. When it comes to complex and sensitive topics, the difference between correlation and causation is huge, and AI doesn’t distinguish between them. As a result, they absolutely do hard-code AI models to avoid certain correlations. Look at the “[character] doing 9/11” meme trend. At a fundamental level it is impossible to prevent undesirable outcomes just by keeping them out of the training data, because there are infinite combinations of innocent things that become sensitive when linked in nuanced ways. The only way to combat this is to manually delink certain concepts; they merely failed to predict it correctly for this specific instance (a toy sketch of what such a delinking rule might look like is below).
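    Purely as a toy illustration of that “manual delinking” idea, here is a minimal, hypothetical post-hoc filter that refuses any prompt combining terms from two blocklists; the term lists, function names, and behavior are invented for the example and are not how any real product works:

    ```python
    # Hypothetical concept-delinking filter, for illustration only.
    # Real moderation pipelines are far more sophisticated than keyword pairs.
    SENSITIVE_EVENTS = {"9/11", "terror attack"}      # invented blocklist
    PROTECTED_SUBJECTS = {"nationality", "religion"}  # invented blocklist

    def violates_delink_rule(prompt: str) -> bool:
        """True if the prompt links a protected subject to a sensitive event."""
        text = prompt.lower()
        return (any(term in text for term in SENSITIVE_EVENTS)
                and any(term in text for term in PROTECTED_SUBJECTS))

    if __name__ == "__main__":
        for prompt in ("a cat wearing a hat", "a religion doing 9/11"):
            verdict = "BLOCKED" if violates_delink_rule(prompt) else "allowed"
            print(f"{verdict}: {prompt!r}")
    ```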


  • I imagine they likely have hard-coded rules about associating content indexed as “terrorist” with a query for a nationality. Most mainstream AI models do have specific rules built in to prevent stuff like this; those rules just aren’t all-encompassing, and results like this can still slip through if there is sufficient influence from the training data (a rough sketch of what such a rule could look like follows at the end of this comment).

    While FB does have content moderators, needing human verification of every single piece of AI-generated content defeats the purpose of AI. If people want AI, a certain amount of non-politically-correct output is going to slip through the cracks. The bottom line is that content moderation as we know it applies heavy biases to fit the safest “viewpoint model,” and any system based on objective data analysis, especially one working from biased samples such as the openly available internet, is going to produce results that do not fit the standard “curated” viewpoint.
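    As a purely hypothetical sketch of the kind of hard-coded rule described above, this drops results tagged “terrorism” whenever a query is classified as asking about a nationality; the tags, nationality list, and data are all invented for the example:

    ```python
    # Hypothetical retrieval-time rule, for illustration only.
    # Assumes indexed results carry moderation tags and queries can be classified.
    from dataclasses import dataclass

    @dataclass
    class Result:
        title: str
        tags: set

    NATIONALITY_TERMS = {"american", "canadian", "french"}  # invented examples

    def query_is_about_nationality(query: str) -> bool:
        return any(term in query.lower() for term in NATIONALITY_TERMS)

    def apply_hard_rule(query: str, results: list) -> list:
        """Drop 'terrorism'-tagged results when the query targets a nationality."""
        if not query_is_about_nationality(query):
            return results
        return [r for r in results if "terrorism" not in r.tags]

    if __name__ == "__main__":
        indexed = [Result("local recipes", {"food"}),
                   Result("news archive", {"terrorism", "news"})]
        print([r.title for r in apply_hard_rule("canadian culture", indexed)])
    ```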