When Did Privacy Become a Bad Word, Crypto Rights are American Rights, Apollo the Robot, & More
Future Essentials - Edition 50
Here’s what you’re getting in this edition:
10 articles from this week
When did privacy become a bad word?
Crypto rights are fundamental American rights
In the age of generative AI, it’s impossible to know where your information is going—or what it’s going to be used for.
A New Contract for Artists in the Age of Generative AI
Stephen King: My Books Were Used to Train AI
Meet Apollo, the ‘iPhone’ of humanoid robots
Is this tiny house that costs $37,600 the solution to the housing crisis?
A meditation for mental control
Question for reflection
10 articles from this week
Shopify enables USDC payments via Solana Pay integration - Blockworks
Coinbase Takes Stake in Stablecoin Issuer Circle - Bloomberg
Friend.tech: Flash in the pan or inevitable future of crypto financialization? - Blockworks
Worldcoin is not what we meant by making crypto mainstream - Blockworks
Alibaba releases two new models in the A.I. arms race | Fortune
Nvidia Chip Shortages Leave AI Startups Scrambling for Computing Power | WIRED
Google and YouTube are trying to have it both ways with AI and copyright - The Verge
When did privacy become a bad word?
The U.S. Department of Justice’s (DOJ) indictment of the Tornado Cash developers, filed Wednesday, is consistent with the government’s apparent disdain for privacy.
Across the government, there seems to be an entrenched assumption that an individual’s desire to keep the details of their life private means they’re engaging in wrongdoing. This overly simplistic assumption is supported neither by the law nor by the reality of why privacy is so important to countless law-abiding citizens in their everyday lives.
Nor does it adequately balance the 21st-century citizen’s right to privacy against the need to ensure the government can effectively enforce the law. LINK
Crypto rights are fundamental American rights
Contrary to crypto skeptics’ views, the merits of decentralization and personal autonomy align with fundamental American rights: free speech, privacy and due process.
It is easy to look at new technology with skepticism, especially when bad actors have capitalized on vulnerabilities within these emerging systems. However, wrongdoers exist in every industry and every technological sector, and they are quite adept at finding holes that good actors haven’t yet filled.
Crypto’s unique benefits play a role in strengthening core American rights. We urge crypto’s antagonists to take a closer look at the technology’s role in enhancing — not undermining — some of our vital Constitutional provisions. LINK
In the age of generative AI, it’s impossible to know where your information is going—or what it’s going to be used for.
In 2010, Mark Zuckerberg told the audience at a TechCrunch awards ceremony that young people—especially social media users—no longer cared about privacy. “People have really gotten comfortable not only sharing more information and different kinds, but more openly and with more people,” he said. “That social norm is just something that has evolved over time.” While this statement obviously hasn’t aged well, it reflects a common belief that privacy violations happen when individuals reveal their own information. In other words, when something posted to Reddit or TikTok goes viral, or a nude photo sent to an admirer leaks, it’s first and foremost the fault of the person who posted it. This model of individualized accountability is very persistent. It’s also completely wrong. And it’s irrelevant in the age of generative AI.
Generative AI completely obliterates the idea of individual responsibility for privacy because you can’t control these algorithms’ access to your information, or what they do with it. Tools like ChatGPT, DALL-E, and Google Bard are trained on data scraped without consent, or even notice. At their worst, training sets suck up vast amounts of digital information and combine it into a data slurry that serves as the raw material for generative AI. As tech companies scramble to incorporate generative AI into every imaginable product, from search engines to games to military gadgets, it’s impossible to know where this output is going, or how it might be interpreted. Their privacy-violating predecessors, data brokers, also scraped the web and assembled massive dossiers on individuals, but their outputs aren’t available to the average person for free, or integrated into search engines and word processors. The widespread availability of generative AI compounds potential privacy violations and opens up more people to harmful consequences. LINK
A New Contract for Artists in the Age of Generative AI
Generative AI has followed a familiar technological hype cycle. It has promised a transformation of how artists work, while threatening to end the careers of other creatives, such as those working in stock photography and copywriting. It could further disrupt, if not eliminate, the work of creatives who depend on commissioned illustration, concept art, and video game art, as well as that of voice-over artists.
The underlying driver of this shift is hard to grapple with. It derives not from what these models produce, but from what they are produced from: vast streams of creative writing, photographs, and drawings shared online. The authors of these works find themselves confronting the reality that they have co-created a massive corpus of training data for the very AI platforms that would undermine their professions. To many, it feels as if their creative fruits have been harvested and blended by an algorithmic juicing machine without warning or consent. Where labor power is strong, as it is for screenwriters and actors, strikes by creative professionals are putting these concerns front and center. But this isn’t an option for independent, small-scale creatives who depend on the Internet to find clients and community. LINK
Stephen King: My Books Were Used to Train AI
Self-driving cars. Saucer-shaped vacuum cleaners that skitter hither and yon (only occasionally getting stuck in corners). Phones that tell you where you are and how to get to the next place. We live with all of these things, and in some cases—the smartphone is the best example—can’t live without them, or so we tell ourselves. But can a machine that reads learn to write?
I have said in one of my few forays into nonfiction (On Writing) that you can’t learn to write unless you’re a reader, and unless you read a lot. AI programmers have apparently taken this advice to heart. Because the capacity of computer memory is so large—everything I ever wrote could fit on one thumb drive, a fact that never ceases to blow my mind—these programmers can dump thousands of books into state-of-the-art digital blenders. Including, it seems, mine. The real question is whether you get a sum that’s greater than the parts, when you pour back out. LINK
Meet Apollo, the ‘iPhone’ of humanoid robots
Humanoids that handle household chores or build habitats on the lunar surface may sound like something from science fiction. But the team at Austin-based robotics startup Apptronik envisions a future where general-purpose robots will handle “dull, dirty and dangerous” jobs so humans don’t have to.
The design for Apptronik’s latest humanoid robot, named Apollo, was unveiled on Wednesday.
The robot is on the same scale as a human being, standing at 5 feet, 8 inches (1.7 meters) tall and weighing 160 pounds (72.6 kilograms).
Apollo can lift 55 pounds (25 kilograms) and has been designed to be mass-produced and to work safely alongside humans. The robot runs on electricity rather than hydraulics, which are considered less safe, and has a four-hour battery that can be swapped out so it can operate across a 22-hour workday. LINK
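For a rough sense of what that battery spec implies, here is a minimal back-of-the-envelope sketch in Python. The four-hour and 22-hour figures come from the article; the pack and swap counts are our own illustration, assuming roughly continuous operation on full packs, not Apptronik numbers.

```python
import math

# Back-of-the-envelope: how many four-hour battery packs would cover a 22-hour workday?
BATTERY_HOURS = 4    # stated runtime per swappable pack (from the article)
WORKDAY_HOURS = 22   # stated daily operating window (from the article)

packs_per_day = math.ceil(WORKDAY_HOURS / BATTERY_HOURS)  # ceil(5.5) = 6 packs
swaps_per_day = packs_per_day - 1                         # 5 mid-shift swaps

print(f"packs needed: {packs_per_day}, swaps: {swaps_per_day}")
```

In other words, the swappable design trades a modest four-hour runtime for near-round-the-clock uptime, at the cost of keeping roughly half a dozen charged packs on hand per robot per day.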
Is this tiny house that costs $37,600 the solution to the housing crisis?
In a new neighborhood of 3D-printed homes in Georgetown, Texas, a 1,574-square-foot house starts at $475,000. In Japan, a construction startup called Serendix recently 3D-printed a smaller house at a fraction of the cost: 5.5 million yen (around $37,600). If it’s built on a small lot in one of Japan’s smaller cities, the startup says, land adds relatively little to the total cost.
“The house of the future costs as much as a car,” Serendix CEO Kunihiro Handa said in an email.
Like other companies with 3D-printing tech for homes, the startup prints layers of concrete to build walls. The Fujitsubo house is designed to be printed in sections that are attached to the foundation with steel columns. The panels on the roof were made on a CNC machine. The whole construction process took 44 hours and 30 minutes, the company says. The $37,600 price includes finished rooms inside. LINK
A meditation for mental control
Question for reflection
How can you reframe any challenges you may be facing from stumbling blocks into steppingstones?
What new skills and insights might you gain as you navigate these opportunities for growth?