The result is systems that can produce text that is very compelling when we as humans make sense of it. But the systems do not have any understanding of what they are producing, any communicative intent, any model of the world, or any ability to be accountable for the truth of what they are saying.
…
When people seek information, we might think we have a question and we are looking for the answer, but more often than not, we benefit more from engaging in sense-making: refining our question, looking at possible answers, understanding the sources those answers come from and what perspectives they represent, etc. Consider the difference between the queries: “What is 70 degrees Fahrenheit in Celsius?” and “Given current COVID conditions and my own risk factors, what precautions should I be taking?”
Whether it was Gap khakis, Patagonia vests, or Allbirds, the counter-cultural ethos that applauded individuality has been replaced by herd thinking. In Silicon Valley, we use a better marketing term for herd: team. One of the biggest trends of the past twenty years has been the rise of corporate swag. Wearing a Google t-shirt, an AirBnB backpack, or a logo-festooned Hydra bottle is a symbol of belonging to a herd called “work.” These logos advertised where you worked and thus gave you a place in Silicon Valley’s social hierarchy.
As the technology industry became the cultural zeitgeist, it became necessary to advertise to the world that you were part of the tech set. And the easiest way to do so is through uniforms. I don’t mean uniform in the strictest sense, but in the way that pinstripes and bold red suspenders were the look for traders and bankers in the heyday of Wall Street. By embracing a uniform, we signal that we are part of the tribe. A uniform is a great leveler, and it shows what team you are on. It is a symbol of power, affiliation, and hierarchy. Its underlying ethos: us versus them.
Musk uses each of the tactics that Trump did. But as Twitter’s owner, CEO and “chief twit,” he has an extra advantage that will make him an especially dangerous threat to democracy if we’re not careful.
Musk now has vast control over what we hear and see on this powerful media platform. (And despite his claims to be a champion of “free speech,” he is busy banning the speech of those he does not like, such as the “Elon Jet” account that uses public information to track his wasteful and environmentally damaging private jet flights.)
… a generative ML system could make lots more ‘disco’ music, and it could make punk if you described it specifically enough (again, prompt engineering), but it wouldn’t know it was time for a change and it wouldn’t know that punk would express that need. When can you ask for ‘something raw, fresh and angry that’s a radical change from prog rock’? And when can a system know people might want that? There is some originality in creating new stuff that looks like the patterns we already have, but the originality that matters is in breaking the pattern. Can you score that?
As AI gets better and better at repeating the patterns that exist. As we get better at producing prompts – what Evans calls ‘prompt engineering’. As AI starts getting integrated into tools in ways we don’t even notice. As all that happens, how do we identify the places where we need human intervention? And not just to retain unique value or to increase the amount of interesting hard work we do, but so that we are not building a future based on our past. Because, you know what? That past is full of exploitation, extraction, and oppression. Many would argue that the very way AI works is an example: its intelligence is built off the work of others, consumed en masse.
So, what does this mean for civil society? How do we take advantage of the technology, influence the system, and judge the output?
Algorithm detectives. We need an independent body that reviews the tools – the algorithms, the training of AI – so that we can be clear about the patterns we are learning from and the ways responses are promoted.
Use AI to identify patterns of injustice. This will require excellent prompt engineers who are asking and asking and asking and then sharing the results.
Illuminate what is missing. If AI is trained on the massive data that is available, we need to show what and who is missing from that data. And we have to find ways to include it. Sometimes that will come from technical means – refining, adding, training – and sometimes it will come from advocacy. It must be intentional.
Use the tools. We can’t just opt out. Usage shapes the tools. We have to use them and aggregate our learnings with the goal of improving our own efficiency and shaping the tools themselves. And let’s normalize it. Don’t hide that you turned the grant question into an AI prompt.
Build context. That’s what struck me about the quote above: context is something humans have in a hyper-local way. We can adjust, disregard, slow down, or speed up what AI generates for us based on the context. We can identify places where the pattern is wrong for reasons of justice and equity. We have the context and the experience.
“Part of the job of making change is working to make sure a bad story doesn’t get in the way of good facts.” This is good advice from Seth Godin. And it is so hard. Especially when well-funded teams of well-trained humans craft excellent stories to elicit action, and those stories are based on lies.
To really accomplish this we have to:
Prioritize access to data and the tools and humans to find the insights in the data. The Data Innovation Lab at Tech Impact is doing interesting work in this area.
Help people interrogate the stories put in front of them. This is some of the work IREX does.
A blockchain is a digital ledger associated with an asset, recording the history of transactions in that asset — who bought it from whom and so on. The asset could be a digital token like a Bitcoin, but it could also be a stock or even a physical thing like a shipping container. Ledgers, of course, are nothing new. What’s distinctive about blockchains is that the ledgers are supposed to be decentralized: They aren’t sitting on the computers of a single bank or other company; they’re in the public domain, sustained by protocols that induce many people to maintain records on many servers.
These protocols are, everyone tells me, extremely clever. I’ll take their word for it. The question I’ve never heard or seen satisfactorily answered, however, is, “What’s the point?” Why go to the trouble and expense of maintaining a ledger in many places, and basically carrying that ledger around every time a transaction takes place?
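Stripped of the clever consensus protocols Krugman alludes to, the “ledger in many places” idea rests on a simple structure: each record carries a hash of the record before it, so any server can check that a copy of the history hasn’t been quietly rewritten. A toy sketch in Python (the function names and transaction fields here are illustrative, not any real protocol):

```python
import hashlib
import json

def block_hash(block):
    # Hash a block's full contents, including the previous block's hash,
    # so altering any entry changes every hash that follows it.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_transaction(ledger, transaction):
    # Each new entry records a transaction plus the hash of the entry before it.
    prev = block_hash(ledger[-1]) if ledger else "0" * 64
    ledger.append({"tx": transaction, "prev_hash": prev})

def verify(ledger):
    # Recompute every link; a single tampered entry breaks the chain.
    return all(
        ledger[i]["prev_hash"] == block_hash(ledger[i - 1])
        for i in range(1, len(ledger))
    )

ledger = []
append_transaction(ledger, {"from": "alice", "to": "bob", "asset": "token-1"})
append_transaction(ledger, {"from": "bob", "to": "carol", "asset": "token-1"})
assert verify(ledger)

ledger[0]["tx"]["to"] = "mallory"  # rewrite history...
assert not verify(ledger)          # ...and every honest copy can detect it
```

The hard part, which this sketch omits entirely, is getting many mutually distrustful servers to agree on which chain is the real one; that is what the consensus protocols are for.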
Krugman’s answer, and the Hacker News comment consensus, is: nothing. At best, it’s a solution in search of a problem.
I don’t agree. I think it’s a solution in search of a user interface. The first use people grokked, if at all, is Bitcoin. Because, money. But that’s not all:
PolicyKit is a peer-to-peer community governance and moderation project.
Can these problems be solved in other ways? Of course. Is blockchain an interesting way to solve them? Yes.
In a world where billionaires can disrupt social communications by buying a platform, war and its crimes continue to be perpetrated, and government records are removed from public view, decentralized tools using a ledger can help us maintain the past, understand the present, and collaborate into the future.
To improve the safety and effectiveness of AI, the first principle suggests that AI systems should be developed not only by experts, but also with direct input from the people and communities who will use and be affected by the systems. Exploited and marginalized communities are often left to deal with the consequences of AI systems without having much say in their development. Research has shown that direct and genuine community involvement in the development process is important for deploying technologies that have a positive and lasting impact on those communities.
This is a role that civil society can take: making sure the communities they serve are reflected in the data used to train AI and in how that training plays out. The difficulty, of course, is how to facilitate that engagement.
The other thing that struck me was that the bean man had been making this Saturday crack-of-dawn drive, and usually another Sunday drive to another city market, for decades. He was one of the originals still at the market. He was an Eastern Shore truck farmer, he sold the only fresh beans around — cannellini, Navy beans, red beans, black beans, October beans, lima beans, Dixie butter beans, speckled beans, black-eyed peas — and in the spring, oh my goodness, he sold fresh peas. That is, he had something that the Baltimore of all colors and incomes loved, he could make a living at it, it was good to do, he did it, he never stopped.
One thing we all should’ve learned from the public hearings of the January 6 select committee is that almost nothing is spontaneous anymore. Dig deep enough, and you’ll find someone organizing these “spontaneous” events (as well as someone bankrolling them). In any case, Reuters found examples of this occasionally criminal behavior all over the country. Most of the incidents ran on the same rails.
When I’m not writing, I’m often surfing. Something most non-surfers don’t realize is that surfing is mostly waiting. You paddle out and then wait for the right wave to roll in. When a promising set rears up out of the deep, you try to catch it. If you hesitate, even for a moment, you’ll either miss the wave or, worse, get sucked over the falls as it breaks. To catch a wave, you have to fully commit.
I suspect the same principle is at work when writing about something changes your mind. The brain is an intricate, sparkling, densely interconnected maze—an easy place for ideas to hide in vague generalities. But writing forces you to commit to specifics as surely as surfers must commit to waves. Seeing an idea reveal itself on the page, you may find yourself entranced or repulsed or inspired by its specificity, its naked meaning.