New York Times
To create A.I. businesses, experts predicted, $1 trillion would be spent on data centers, utilities and applications. Mr. Covello thought those costs made it impossible for the industry to inexpensively solve real-world problems, which is what internet companies did decades ago.

As a member of Goldman’s working group on A.I., he reviewed a service that used generative A.I. to automatically update analysts’ spreadsheets with companies’ financial results. He said it saved his analysts about 20 minutes of time per company but cost six times as much money.
I think this is exactly it. AI is too expensive in terms of energy use and is highly subsidized by venture capital at the moment. The minute that money dries up there won't be enough of a payoff to justify the expense.
Tech Policy Press
It is not the case that “AI gathers data from the Web and learns from it.” The reality is that AI companies gather data and then optimize models to reproduce representations of that data for profit. It’s worth preserving distinctions between the process behind building software systems and the social investments that aim to cultivate the human mind.

...

The learning myth downplays the role of data in developing these systems, perpetuating a related myth that data is abundant, cheap, and labor-free. These myths drive down the value of data while hiding the work of those who shape, define, and label that data.
I want to quote this entire essay. So many great points to consider about how generative AI is sold to the public.
Crikey
Artificial intelligence is worse than humans in every way at summarising documents and might actually create additional work for people, a government trial of the technology has found.
WHAT?!?! (sarcasm)
The Verge
We briefly lived in an era in which the photograph was a shortcut to reality, to knowing things, to having a smoking gun. It was an extraordinarily useful tool for navigating the world around us. We are now leaping headfirst into a future in which reality is simply less knowable. The lost Library of Alexandria could have fit onto the microSD card in my Nintendo Switch, and yet the cutting edge of technology is a handheld telephone that spews lies as a fun little bonus feature.
Oh no. These examples are impossible to ID as AI.

Here are some more examples by Chris Welch on Threads.
the-decoder.com
Microsoft attempted to remove the false entries but only succeeded temporarily. They reappeared after a few days, SWR reports. The company's terms of service disclaim liability for generated responses.
It's almost like they're trying to say "generated responses" aren't the product. Every company should just add "we're not responsible for the products we sell" to their TOS to avoid all liability. Genius!
404 Media
Generative AI could “distort collective understanding of socio-political reality or scientific consensus,” and in many cases is already doing that, according to a new research paper from Google, one of the biggest companies in the world building, deploying, and promoting generative AI.
Thanks for the warning while you keep destroying reality, I guess?
stackoverflow.blog
The present wave of generative AI tools has done a lot to help us generate lots of code, very fast. The easy parts are becoming even easier, at a truly remarkable pace. But it has not done a thing to aid in the work of managing, understanding, or operating that code. If anything, it has only made the hard jobs harder.
If AI starts doing the work of junior developers, how will you get senior developers in the future?
doublepulsar.com
I think they are probably going to set fire to the entire Copilot brand due to how poorly this has been implemented and rolled out. It’s an act of self harm at Microsoft in the name of AI, and by proxy real customer harm.
AI has really obliterated the idea of getting consent from users. Big companies are just enabling data theft on a grand scale now. It's as if the people who build houses worked for the thieves rather than the homeowners.
Medium
What is AI? In fact this is a marketing term. It’s a way to make certain kinds of automation sound sophisticated, powerful, or magical and as such it’s a way to dodge accountability by making the machines sound like autonomous thinking entities rather than tools that are created and used by people and companies.
Emily Bender has a great clarifying way of thinking about AI. I found her breakdown of the kinds of systems that are being called AI today very helpful.
helmut-schmidt.de
By narrating their products and services as the apex of “human progress” and “scientific advancement,” these companies and their boosters are extending their reach and control into nearly all sectors of life, across nearly every region on earth. Providing the infrastructure for governments, corporations, media, and militaries. They are selling the derivatives of the toxic surveillance business model as the product of scientific innovation.
Interesting talk by Meredith Whittaker (President of the Signal Foundation) looking at big tech's surveillance business model and how we might imagine a different way.
tomshardware.com
Ben continues in his thread, "[The moderator crackdown is] just a reminder that anything you post on any of these platforms can and will be used for profit. It's just a matter of time until all your messages on Discord, Twitter etc. are scraped, fed into a model and sold back to you."
These user conflicts highlight how site owners extract monetary value from the community without sharing it back. Now some third party will be making money off users' time and energy. So disappointing to see companies being bad stewards of good impulses like the desire to pitch in and help share knowledge.
404 Media
Google Books is indexing low quality, AI-generated books that will turn up in search results, and could possibly impact Google Ngram viewer, an important tool used by researchers to track language use throughout history.
AI is why we can't have nice things. Also, maybe having a private, for-profit company organize the world's information was a terrible idea. They make decisions to maximize their profits, not to provide a data heritage for humanity.