⚡ Replies To: Take Back the Future!
How eight technologists see their future and ours
Today at 5pm PT, we’re chatting with Stanford professors Rob Reich, Mehran Sahami, and Jeremy Weinstein about their book System Error and their broader work reimagining how CS, politics, and ethics are taught to undergrads and tech workers nationwide. It’ll be a fascinating conversation that moves across disciplines and from theory to practice.
The event will be facilitated by Reboot community members Ben Wolfson and Shira Abramovich. You can RSVP to the Q&A here:
🗯 reboot replies: take back the future!
Edited by Jessica Dai
We asked Reboot readers and community members to reflect on and respond to Jasmine's manifesto for Reboot, "Take Back the Future." I'm really excited to share this range of responses.
on ideology itself
Caden Felton on seeking ideology and community
For a long time, I've said that I wanted to change the world — that I wanted to make an impact. It sounded good and righteous, but it was an illuminating moment when I realized that I had no idea what that meant and was just grasping for a legacy. Since then, I've been searching for communities and ideologies to help me thoughtfully imagine the future I wanted to play a part in shaping.
The technology community was the first I encountered; I was immediately enthralled by the grandiose visions of an abundant future. But, as Jasmine aptly describes, blind techno-utopianism fueled by the same studio that brought you "Move Fast and Break Things" seemed to lack depth and cast aside hard critical considerations. But the opposite, unconstructive pessimism, seemed uninspiring and defeatist. I continued looking for communities; Effective Altruism (EA) was a particularly notable one.
I admire EA's mission to maximize the welfare of all beings, and I've directed time, effort, and resources toward EA-prescribed goals. But I'm still early in formalizing my beliefs, and I've yet to accept any wholesale — Is life an optimization problem? What does it mean for something to be high impact? How do we find value beyond instrumentalization? I'm far from answering these, and I've added them to the questions posed to me as a young person in tech.
Should we harness powerful technologies, dream of utopias, and reach for the stars? Do we desperately need deep reflection into how the technology we're building interfaces with the society that uses it? It doesn't seem impossible for the answer to both to be yes.
Vikram Singh on the dangers of techno-optimism
Democratisation of power must happen within the design, development and delivery of technologies. Yet lingering problems of the technologist's mindset remain. Heidegger said the modern world is 'revealed' to us not as something with its own integrity and purpose, but rather as a resource or 'standing reserve' to exploit; current technologists, especially optimists, see the world 'revealed' to them via digital tech.
The issue with technologists (and as a UX designer, I include myself), and especially tech optimists, is that they cannot extricate themselves from this mindset. For them, the world they come from, including its language, infrastructure and habits, prefigures how their solutions (financial, social, environmental, or otherwise) will unfold. Problems are created, and tech is inevitably the answer. You might call this solutionism — but it's more than that, because it's also about the creation and abstraction of problems.
There are other ways of being in the world — aligning, situating, de-growing, or simply doing less.
Ivan Zhao on building institutions
When you grow up gay and Chinese, there aren’t many people who look like you in traditional media; the internet felt like a space I could participate in. On the internet, I was able to interact with people through early messaging apps, comments, and other media — in spaces where I felt myself.
What should the next generation of technologists and technology look like? I’m reminded of a New Yorker piece in response to Marc Andreessen’s infamous “It’s Time To Build”: Anna Wiener argues that Andreessen wants to build better institutions. Taken seriously, Andreessen seemed to be suggesting an entirely new version of Silicon Valley: a movement away from making software to making institutions. Jasmine’s essay points directly to this, that those creating the future are yearning for an opportunity to lead the world, but are fundamentally jaded and disappointed with our current institutions.
Growing up, tech was a window to understanding my identity. I’m convinced that taking back the future means we need to take it back for everyone, especially those from historically marginalized groups. We need to create institutions that uplift them. The narrative shouldn’t focus on the next DTC backpack startup; it should focus on organizations that center their goals around individuals, collectives, and groups.
on individual career choices
Scott Fitsimones on why consumer choices matter more
Rather than expecting technologists to implement Jasmine's ideas simply out of goodwill, we should be asking how to create a market for these ideas. From worker-owned products to combatting algorithmic bias, the ideas she proposes won't get built until consumers — not just technologists — change their minds about what they value. The hero of the story is not the newly awoken technologist who quits a big tech job to build a worker-owned cooperative: it is the customer who chooses the co-op.
The current narrative risks creating two classes of tech workers: a privileged class that can quit their well-paying jobs and question the system, and a guilty class that depends on a paycheck to send money back home or pay off debt, and yet is shunned by the enlightened privileged class. If the market wants something to be built, it's inevitable to some degree that it will be; it's just a question of at what cost. Given that inevitability, let us place the burden not only on the ethical technologist, but on the ethical consumer, so that the market as a whole can move in a better direction.
Anthony Tan on why "sellout" shouldn't be a dirty word
Staying ideologically pure was much easier in the classroom. The problem with the term “sellout” is that it can be applied to anyone doing anything linked to the structures of capital and power; the “system.” A glance at my resume will show that I’ve struggled with this question myself, working in fields from nonprofit and academia to corporations and startups.
In most cases, the term is counterproductive. “Selling out” categorizes people into good and evil. It suggests that either you wholly dedicate your life to social issues, or you are selfish. This moral binary is false and toxic. Calling people out makes enemies of potential allies. Calling people in is wiser, reconciliatory, and more productive: it invites their support and reinforces your basic faith in humanity. Whether as leaders or followers, every group in society has a part to play in social progress.
To those still cynical, I would say that engaging with power — critiquing, reforming, and yes, gaining and sharing it — in the right ways, for the right reasons, is not selling out. It’s admirable.
Mathurah Ravigulan on what it means to work for "social good"
Going into a role at a company with a “social good” mission, I expected to be talking about the problem at hand every day, proposing and building features that’ll make our users' lives better. To my dismay, I was farther removed from the actual problem than I could've ever imagined.
After spending so much time in meetings discussing how much padding needs to go on the new sign-up button, I wondered what my purpose was. Why do we spend so much time discussing design systems, instead of designing actual systems that solve systemic problems? It almost feels like a waste of time and resources to have so many smart people in a room have strong opinions on software designs that won’t even matter after 5pm. But if there isn’t anyone working on these problems, we won’t have well-designed software.
Telling myself I was contributing to a greater mission helped me feel less guilty about selling out, but I’m realizing that a lot of tech roles are the same — social good or not. You’re going to optimize for user sign-ups, fix bugs, and argue about design systems no matter where you go. In other words, I was channeling my techno-optimism into the wrong things.
If I approach all our current systems with cynicism, that’s exactly what I'll get back from them. Instead, I want to put my time into communities, not companies — and create change instead of expecting it.
on the path forward
Scott Moore on decentralized autonomous organizations (DAOs)
For those skeptical that technology can empower us, I can only say that it must: each time a new technology is created, there is no going back. Our failure to recognize the paradigm shifts we exist within can sometimes lead to catastrophic consequences. (Facebook 10 years ago surely never sought to eventually influence elections.) Many of those who want to see change are making a grave mistake by refusing to use technology’s speed and scale to empower people.
DAOs and other internet-native organizations are one great example of tools with the potential to let us keep one foot in existing systems while building out new ones in parallel: new modes of cooperative governance that adhere to principles we care about, such as those set out in the Rochdale Principles. The beauty of DAOs is that they are, at least by definition, tabula rasa, and can be experimented with and implemented without any of the baggage that even well-meaning cooperative systems like Rochdale's can carry.
Critically, though, blank slates come with risk. Once we make our marks, we cannot undo them, and we must examine these systems carefully as we build them. DAOs, like many technologies, have many potential problems, but ignoring them is simply irresponsible. For better or worse, we live in a world of institutions and tools, and we must shape the tools we want to use.
Spencer Chang on sharing the magic
I’m heavily drawn to ideals, so nothing drew me more than the shining G, which had started to become the face of everyone’s experience with the internet — I even wrote private love letters to Google and binged the “If Google Was A Guy” series. When I came to Silicon Valley for the first time, my expectations were finally put to the test in reality. I found myself at the center of a paradox, a dissonance between what I believed and what I discovered.
When technology first took over the world, it felt like magic because it was beyond comprehension for the vast majority of society. If we are to seize technology for our own benefit rather than that of immense corporations and governments, we, as technologists, have the responsibility to make our technological tools malleable. That is, we must make it both possible and approachable for everyone, not just technologists, to decode the black box of our technological magic and adapt it to our needs.
As technologists, we hold the keys to the magic, but the utopian demand I’d make is for everyone to hold the keys to the magic. Rather than giving cake to the masses, we must give them the recipe. I demand a future where the magic is ours, the people’s, rather than belonging to an elite class of wizards and faceless entities. The software we interact with every day should not only allow, but encourage us to make it our own: to create our thoughts, art, and experiments rather than merely consume SEO-juiced content; to express our full selves rather than settling for uniform cookie-cutter molds; to play outside the box, in between the lines, and beyond the edges of normality rather than being constrained within the rules of the status quo. There have been prior projects and experiments exploring utopian visions in this space, but we need dedicated focus to shift the power balance in technology back to end users, in a form that is accessible to all.
Caden Felton is a software engineer currently taking time off from school to work at a startup in NYC. He's fascinated by cities and their future, mechanism design, economic development, and how we can make humanity a love-based species — reach out if you are too :)
Vikram Singh is a UX designer and researcher.
Ivan Zhao is a creative technologist and a partner at Dorm Room Fund.
Scott Fitsimones is a founder, builder, urbanist, and musician who is passionate about better cities, affordable housing, and mechanism design.
Anthony Tan is a recent business, sustainability, and philosophy grad, and co-founder of open-source VR social network ROVR.
Mathurah Ravigulan is a 3rd year Systems Design Engineering student at the University of Waterloo who is interested in product and venture.
Scott Moore is a co-founder of Gitcoin, a community focused on building and sustaining digital public goods. He has also participated in a number of other related initiatives such as the Digital Public Goods Alliance and SustainOSS.
Spencer Chang is a technologist, writer, and stubborn dreamer who is passionate about empowering people to leverage software for their needs, exploring ways for us to cultivate meaning and fun in our digital spaces, and cataloguing the best sunsets.
Are there stamp collectors in the metaverse? Kyle Chayka explores how digital platforms and changing, algorithmic feeds have blocked our ability to curate — and thus interpret — the artifacts and experiences that define our lives.
a16z just launched a massive web3 policy hub (read: crypto lobbying arm) to accompany their burgeoning web3 startup and media portfolio.
“New money,” or the myth of self-made wealth, is core to Silicon Valley’s identity. Adrian Daub explores the cracks in this bootstrapped narrative by unraveling the generational dynasties behind modern tech success.
💝 a closing note
We’d love to use our publication as a platform for thoughtful conversation and debate among our community of readers, and are experimenting with new formats to share more perspectives with you.
If you liked the “reply” format or have other ideas, hit reply to this email to let us know.
Toward plural futures,
Jasmine & Reboot team