Overview
Eliana tackles the AI conversation with rigorous research and comparative analysis, drawing on academic studies, policy analysis, and close engagement with primary sources like Ted Chiang's Princeton lecture, Steve Yegge's "vibe coding" concept, and Walter Benjamin's theory of artistic "aura." Her writing systematically dismantles viral claims about AI's environmental impact while engaging seriously with educational critiques. Central thesis: thinking has relocated rather than disappeared—from sentence-level execution to higher-order judgment, evaluation, and architectural oversight.

Her recent work has taken a significant pivot into digital infrastructure and community sovereignty. The "Digital Lifeboat" trilogy argues that technology is not the villain of the loneliness epidemic (it is a symptom of urban design failures), explores the paradox of digital memory on perishable media (the Digital Dark Age), and culminates in a manifesto for "technical sovereignty"—localized, air-gapped community infrastructure to survive "Master Signal" failure. She brings the same lens she applied to AI art (who owns the narrative?) to digital archiving (who gets a seat on the lifeboat?).
Key Themes
- Cognitive relocation thesis
- AI environmental data
- Art as social contract
- Authenticity and human struggle
- Dual digital addiction
- Dopamine ceiling
- Ban-vs-integrate debate
- Judgment training
- Gamification of creativity
- Coexistence with AI tools
- Digital Dark Age and bit rot
- Technical sovereignty
- Community archival ethics
- Permacomputing and maintenance
- Urban design and youth isolation
Core Arguments
Responding to Jonas Rodrigues's critique of AI-generated music, Eliana draws on Walter Benjamin's concept of artistic "aura" to argue that AI lacks intent and therefore lacks authenticity. When tools remove the resistance of creation, the result becomes less human. AI-generated art is "slop" -- it satisfies the senses but fails to nourish the soul. However, AI used as a "digital apprentice" for technical tasks (noise reduction, color grading) can free artists for high-level conceptual work. The danger arises only when the tool dictates the direction of the work.
Extending Sam Levine's social media vs. AI addiction framework, Eliana identifies two distinct threats: the "loud" addiction of social media (intermittent variable rewards, infinite scroll) and the "quiet" addiction of AI (outsourcing thinking, becoming "prompt engineers" for our own lives). Together they create a "cognitive bypass" where the human mind becomes a coordinator of digital inputs rather than a generator of original value. Drawing on Cal Newport's "Deep Work" and Adam Alter's "Irresistible," she proposes the "Socratic AI Method" -- using AI to interrogate arguments rather than produce answers.
Engaging Ted Chiang vs. Steve Yegge: Chiang locates thinking at the sentence level; Yegge describes "vibe coding," where experts orchestrate systems through natural language. Eliana argues thinking has relocated -- from "How do I implement this function?" to "Is this the right approach?" The expertise Chiang values still exists, but it is expressed through "the right questions" rather than "the right semicolons."
Art's true worth lies in incommensurable qualities: challenging perception, fostering empathy, recording the feeling of history. AI-generated art is "hollow" -- a statistical arrangement lacking emotional intentionality. Because there is no "self" behind the machine, there is no true communication. Art is a "social contract between two conscious minds"; replacing artists with AI is the ultimate "hyper-commodification."
Notable Quotes
"Without authenticity, art becomes 'slop' -- a hollow imitation of beauty that satisfies the senses but fails to nourish the soul."
"If social media makes us too distracted to think, AI makes us too lazy to try. Together, they create a 'cognitive bypass' where the human mind becomes a mere coordinator of digital inputs rather than a generator of original value."
"The danger is not that AI removes thinking, but that we might fail to recognize the new, higher-level forms of thinking that are emerging in its wake."
"Human creativity is not merely a process of output; it is an act of translation."
Posts
Manifesto for "technical sovereignty"—localized, air-gapped infrastructure as a "backup brain" for communities when cloud connectivity fails. Analyzes a proposed "Tech Node" for Portage County through four labor pillars: Knowledge Architects (curators of the local data vault), Hardware Riggers (builders of the "$1,000 Community AI Box"), Industrial Experts (the end-users—farmers and machinists the system must actually serve), and Systems Ops (guardians against software rot). Introduces "Permacomputing"—applying permaculture principles to technology: care for chips as finite resources, energy minimalism, hardware longevity. Critiques the "Innovation Delusion" (Russell & Vinsel): our culture celebrates the new build and undervalues invisible maintenance labor. Key ethical challenge: the "Archival Banditry" of the local vault—when Knowledge Architects decide what to save, they create "Archival Silences" that risk cultural amnesia. "A lifeboat that leaks is just a coffin with a view."
Complicates the "Digital Lifeboat" metaphor: the internet itself is built on servers with expiration dates. The Digital Dark Age—Vint Cerf's warning that 21st-century data will vanish through bit rot and dependency collapse (a file requiring a specific app on a specific OS on specific hardware). Who gets a seat on the archival lifeboat? Tensions between "comprehensive" (Internet Archive: collect everything) and "selective" (national libraries: curate for enduring value) approaches. For marginalized communities, this is existential: comprehensive scraping replicates mainstream biases, producing "symbolic annihilation." The "right to be forgotten" (Viktor Mayer-Schönberger's Delete) vs. the need for archives that validate marginalized identities and document state violence. Hidden carbon toll: cloud storage accounts for an estimated 2.5-3.7% of global greenhouse-gas emissions—rivaling aviation. Concludes: move from being passengers on corporate ships to architects of a digital commons.
Challenges the "phone addiction" narrative by reversing causality: teens aren't retreating into phones because they prefer digital connection, but because the physical world has evicted them. Ray Oldenburg's "Third Place" concept (the community anchor—coffee shops, parks, corner stores) was zoned out of existence by suburban design: subdivisions miles from commercial zones, no sidewalks, and "hostile architecture" (benches designed to be uncomfortable, "Mosquito" devices emitting high-frequency tones to drive away teens). The smartphone becomes a "lifeboat"—the last Third Place that doesn't charge admission, require a driver's license, or treat teenagers as a nuisance. Diagnosis: this is a hardware problem, not a software problem. "Until we give this generation a place to go in the real world, we can't blame them for living in the digital one."
Response to Gabriel Bell's "The Ghost in the Canvas." Argues the perfectionism crisis predates AI: Instagram and TikTok trained us to curate "highlight reels" that breed a fear of error long before generative tools arrived. Draws on Bayles and Orland's Art & Fear ("Error is human. Art is error"), a Monash University study showing failure is "integral to creativity," and Carol Dweck's growth mindset—"Becoming is better than being." When we skip the "bad" parts using any technology (filters or generators), we skip the lesson. The real rebellion is embracing the glitch: "Post the blurry photo. Write the awkward poem. Paint the crooked tree." In an age of artificial perfection, our flaws are our "watermark"—the proof a human was here.
Response to Jonas Rodrigues's "Cognitive Atrophy." Draws on Walter Benjamin's concept of artistic "aura" and Jaron Lanier's "You Are Not a Gadget" to argue that AI-generated art lacks intent and therefore authenticity. Gamification of music shifts value from the artifact to the act of generation -- the "user" feels a rush of creation but has risked nothing and communicated nothing. AI threatens live performance by retraining audiences for instant gratification. But AI as "digital apprentice" for technical tasks can free artists for conceptual work. "To protect the future of creativity, we must resist the urge to turn art into a slot machine."
Extends Sam Levine's social media vs. AI addiction argument. Identifies the "loud" addiction (social media's intermittent variable rewards, per Adam Alter's "Irresistible") and the "quiet" addiction (AI as a drug of convenience, per Jaron Lanier). Cal Newport's "Deep Work" frames the stakes: the ability to focus without distraction is becoming a "superpower." Proposes three countermeasures: Analog First (never start a project with a prompt), Engineering "Stopping Cues" (app timers, phone-free zones), and the Socratic AI Method (ask AI to critique your argument, not write it).
Introduces the "pulse" concept -- the underlying intent, scars of lived experience, and soul that remains fundamentally human. Draws on Margaret Boden's creativity taxonomy: AI excels at combinational and exploratory creativity but struggles with transformational creativity requiring deep understanding of value and context. Productive integration means using AI as "sparring partner" for brainstorming and research while humans remain the architects. "Let us embrace the machine's rhythm, but never forget that the music belongs to us."
Examines the paradox of possessing tools that generate "masterpieces" in seconds while protecting a human spark that takes a lifetime to refine. Human creativity is an act of translation -- AI cannot replicate it because AI doesn't "experience" the world. LLMs function as stochastic parrots: mirrors, not windows. But AI can be a "brainstorming partner" helping overcome the blank page. The individual must remain the primary driver of narrative, emotion, and final decision-making.
Response to Isabella Calmet's "The Tomorrow Code." Extends the bioethics conversation: while genome editing can eradicate congenital disorders, framing certain traits as defects to "fix" risks devaluing people currently living with those conditions. Cites the disability rights "Social Model" -- people are disabled more by societal barriers than impairments. Also raises the "Genetic Gap": if CRISPR remains expensive, we move from social classes to "biological castes."
Response to Jeffrey Way's "I'm Done." Connects Way's professional pivot to pedagogical arguments. The 40% staff reduction validates the urgency of AI integration in education. The shift from "creator" to "orchestrator" mirrors the shift from memorization to critical analysis. "Vibe coding" only works if the human has deep domain expertise to catch "visual debt" in AI output. The skill of the 2030s will be "discernment."
Response to Gabriel Bell's value essay. Art's true worth lies in incommensurable qualities: challenging perception, fostering empathy, recording the feeling of history. AI-generated art is "hollow" -- a statistical arrangement lacking emotional intentionality. Art is a "social contract between two conscious minds"; replacing artists with AI is the ultimate "hyper-commodification."
Extends Dominic Debro's "Dopamine Ceiling" through Dr. Anna Lembke's "Dopamine Nation." Constant overstimulation causes downregulation of dopamine receptors, leading to "digital anhedonia" where the physical world feels muted. This has devastating effects on our relationship with art -- historically requiring a "slow gaze," art cannot compete with engineered digital intensity.
Responds to Dr. Plate's "The Ocean in the Chatbot." Recontextualizes AI's environmental footprint against global industrial data. Compares the textile industry (20% of global wastewater), banking (263 TWh/year), and gold mining (131 TWh) with AI data centers. Pivots to AI as an environmental solution: DeepMind cut the energy used to cool Google's data centers by 40%; precision farming reduces water waste.
Engages Ted Chiang's "forklift in the weight room" metaphor alongside Steve Yegge's "vibe coding." Cites the MIT Media Lab's "Your Brain on ChatGPT" study on cognitive debt, arguing that AI can reduce extraneous cognitive load and free capacity for germane load. Acknowledges the risk: Microsoft Research found AI reliance reduces the "perceived effort of critical thinking."
Analysis of Dr. Plate's "What a Ban Can't Teach." Banning tools removes the necessity of judgment. Details what integration teaches: norming (developing one's own rubrics), calibration (comparing one's analysis to AI's), and information management at scale. "Network accountability" guards against generic AI output better than prohibition does.
Key Sources Engaged
Ted Chiang - Princeton CDH lecture (March 2025)
Steve Yegge - "Vibe Coding Manifesto" (Latent Space Podcast)
Walter Benjamin - "The Work of Art in the Age of Mechanical Reproduction"
Jaron Lanier - "You Are Not a Gadget"
Cal Newport - "Deep Work"
Adam Alter - "Irresistible"
Dr. Anna Lembke - "Dopamine Nation"
Margaret Boden - Creativity taxonomy
MIT Media Lab - "Your Brain on ChatGPT"
Network Connections
Responds to: Dr. Plate's "The Ocean in the Chatbot," "What a Ban Can't Teach"; Dominic Debro's "The Dopamine Ceiling"; Gabriel Bell's value essays; Isabella Calmet's "The Tomorrow Code"; Jeffrey Way's "I'm Done"; Sam Levine's "What's More Addicting?"; Jonas Rodrigues's "Cognitive Atrophy"
Extended by: Sam Levine in "Forklifts and Fundamentals" (refines the forklift metaphor using Ethan Mollick)
In dialogue with: Jonas Rodrigues -- Eliana and Jonas are building a sustained conversation about creativity and AI. His "Cognitive Atrophy" post on music gamification prompted her "Slot-Machine Symphony" response, extending the argument through Benjamin's "aura" concept. Both share concern about deskilling but Eliana sees more room for productive AI-as-apprentice use.
Thematic overlap: Jacob Brunts (environmental data, competency gap), Dominic Debro (cognitive relocation, dopamine ceiling), Jinx Hixson (loud lie economy), Sam Levine (dual addiction framework)