The internet is broken—but it doesn’t have to be. If you’re concerned about how surveillance, online advertising, and automated content moderation are hurting us online and offline, the Electronic Frontier Foundation’s How to Fix the Internet podcast offers a better way forward. EFF has been defending your rights online for over thirty years and is behind many of the biggest digital rights protections since the invention of the internet. Through curious conversations with some of the leading minds in law and technology, this podcast explores creative solutions to some of today’s biggest tech challenges. Hosted by EFF Executive Director Cindy Cohn and Activism Director Jason Kelley.
Bonus · Sun, March 23, 2025
This episode was first released on May 2, 2023.

Dr. Seuss wrote a story about a Hawtch-Hawtcher Bee-Watcher whose job it is to watch his town’s one lazy bee, because “a bee that is watched will work harder, you see.” But that doesn’t seem to work, so another Hawtch-Hawtcher is assigned to watch the first, and then another to watch the second... until the whole town is watching each other watch a bee. To Federal Trade Commissioner Alvaro Bedoya, the story—which long predates the internet—is a great metaphor for why we must be wary of workplace surveillance, and why we need to strengthen our privacy laws.

Bedoya has made a career of studying privacy, trust, and competition, and wishes for a world in which we can do, see, and read what we want, living our lives without being held back by our identity, income, faith, or any other attribute. In that world, all our interactions with technology—from social media to job or mortgage applications—are on a level playing field. Bedoya speaks with EFF’s Cindy Cohn and Jason Kelley about how fixing the internet should allow all people to live their lives with dignity, pride, and purpose.

In this episode, you’ll learn about:
- The nuances of work that “bossware,” employee surveillance technology, can’t catch.
- Why the Health Insurance Portability and Accountability Act (HIPAA) isn’t the privacy panacea you might think it is.
- Making sure that one-size-fits-all privacy rules don’t backfire against new entrants and small competitors.
- How antitrust is fundamentally about small competitors and working people, like laborers and farmers, deserving fairness in our economy.

Alvaro Bedoya was nominated by President Joe Biden, confirmed by the U.S. Senate, and sworn in on May 16, 2022 as a Commissioner of the Federal Trade Commission; his term expires in September 2026. Bedoya was the founding director of the Center on Privacy & Technology at Georgetown University Law Center, where he was also a visiting professor of law.
He has been influential in research and policy at the intersection of privacy and civil rights, and co-authored a 2016 report on the use of facial recognition by law enforcement and the risks that it poses. He previously served as the first Chief Counsel to the Senate Judiciary Subcommittee on Privacy, Technology and the Law after its founding in 2011, and as Chief Counsel to former U.S. Sen. Al Franken (D-MN); earlier, he was an associate at the law firm WilmerHale. A naturalized immigrant born in Peru and raised in upstate New York, Bedoya previously co-founded the Esperanza Education Fund, a college scholarship for immigrant students.
Bonus · Fri, October 11, 2024
This episode was first released on March 21, 2023.

The promise of the internet was that it would be a tool to melt barriers and aid truth-seekers everywhere. But it feels like polarization has worsened in recent years, and more internet users are being misled into embracing conspiracies and cults. From QAnon to anti-vax screeds to talk of an Illuminati bunker beneath Denver International Airport, Alice Marwick has heard it all. She has spent years researching some dark corners of the online experience: the spread of conspiracy theories and disinformation. She says many people see conspiracy theories as participatory ways to be active in political and social systems from which they feel left out, building upon beliefs they already harbor to weave intricate and entirely false narratives.

Marwick speaks with EFF’s Cindy Cohn and Jason Kelley about finding ways to identify and leverage people’s commonalities to stem this flood of disinformation while ensuring that the most marginalized and vulnerable internet users are still empowered to speak out.

In this episode you’ll learn about:
- Why seemingly ludicrous conspiracy theories get so many views and followers
- How disinformation is tied to personal identity and feelings of marginalization and disenfranchisement
- When fact-checking does and doesn’t work
- Thinking about online privacy as a political and structural issue rather than something that can be solved by individual action

Alice Marwick is director of research at Data & Society. Previously she was an Associate Professor in the Department of Communication and co-founder and Principal Researcher at the Center for Information, Technology and Public Life at the University of North Carolina, Chapel Hill. She researches the social, political, and cultural implications of popular social media technologies.
In 2017, she co-authored Media Manipulation and Disinformation Online (Data & Society), a flagship report examining far-right online subcultures’ use of social media to spread disinformation, for which she was named one of Foreign Policy magazine’s 2017 Global Thinkers. She is the author of Status Update: Celebrity, Publicity and Branding in the Social Media Age (Yale 2013), an ethnographic study of the San Francisco tech scene that examines how people seek social status through online visibility, and co-editor of The Sage Handbook of Social Media (Sage 2017).
S5 E10 · Tue, July 02, 2024
The early internet had a lot of “technological self-determination” — you could opt out of things, protect your privacy, control your experience. The problem was that it took a fair amount of technical skill to exercise that self-determination. But what if it didn’t? What if the benefits of online privacy, security, interoperability, and free speech were more evenly distributed among all internet users?

This is the future that award-winning author and EFF Special Advisor Cory Doctorow wants us to fight for. His term “enshittification” — a downward spiral in which online platforms trap users and business customers alike, treating them more and more like commodities while providing less and less value — was selected by the American Dialect Society as its 2023 Word of the Year. But, he tells EFF’s Cindy Cohn and Jason Kelley, enshittification analysis also identifies the forces that used to make companies treat us better, helping us find ways to break the cycle and climb toward a better future.

In this episode you’ll learn about:
- Why “intellectual property” is a misnomer, and how the law has been abused to eliminate protections for society
- How the tech sector’s consolidation into a single lobbying voice helped bulldoze the measures that used to check companies’ worst impulses
- Why recent antitrust actions provide a glimmer of hope that megacompanies can still be forced to do better for users
- Why tech workers’ labor rights are important to the fight for a better internet
- How legislative and legal losses can still be opportunities for future change

Cory Doctorow is an award-winning science fiction author, activist, journalist and blogger, and a Special Advisor to EFF.
He is the editor of Pluralistic and the author of novels including “The Bezzle” (2024), “The Lost Cause” (2023), “Attack Surface” (2020), and “Walkaway” (2017); young adult novels including “Homeland” (2013) and “Little Brother” (2008); and nonfiction books including “The Internet Con: How to Seize the Means of Computation” (2023).
S5 E9 · Tue, June 18, 2024
Artificial intelligence will neither solve all our problems nor likely destroy the world, but it could help make our lives better if it’s both transparent enough for everyone to understand and available for everyone to use in ways that augment us and advance our goals — not for corporations or government to extract something from us and exert power over us. Imagine a future, for example, in which AI is a readily available tool for helping people communicate across language barriers, or for helping vision- or hearing-impaired people connect better with the world.

This is the future that Kit Walsh, EFF’s Director of Artificial Intelligence & Access to Knowledge Legal Projects, and EFF Senior Staff Technologist Jacob Hoffman-Andrews are working to bring about. They join EFF’s Cindy Cohn and Jason Kelley to discuss how AI shouldn’t be a tool to cash in, or to classify people for favor or disfavor, but instead a way to engage with technology and information that advances us all.

In this episode you’ll learn about:
- The dangers in using AI to determine who law enforcement investigates, who gets housing or mortgages, who gets jobs, and other decisions that affect people’s lives and freedoms.
- How “moral crumple zones” in technological systems can divert responsibility and accountability from those deploying the tech.
- Why transparency and openness of AI systems — including training AI on consensually obtained, publicly visible data — is so important to ensure systems are developed without bias and to everyone’s benefit.
- Why “watermarking” probably isn’t a solution to AI-generated disinformation.

Kit Walsh is a senior staff attorney at EFF, serving as Director of Artificial Intelligence & Access to Knowledge Legal Projects.
She has worked for years on issues of free speech, net neutrality, copyright, coders’ rights, and other issues that relate to freedom of expression and access to knowledge, supporting the rights of political protesters, journalists, remix artists, and technologists to agitate for social change and to express themselves through their stories and ideas. Before joining EFF, Kit led the civil liberties and patent practice areas at the Cyberlaw Clinic, part of Harvard University’s Berkman Klein Center for Internet and Society; earlier, she worked at the law firm of Wolf, Greenfield & Sacks, litigating patent, trademark, and copyright cases in courts across the country. Kit holds a J.D. from Harvard Law School and a B.S. in neuroscience from MIT, where she studied brain-computer interfaces and designed cyborgs and artificial bacteria.

Jacob Hoffman-Andrews is a senior staff technologist at EFF, where he is lead developer on Let’s Encrypt, the free and automated Certificate Authority.
S5 E8 · Tue, June 04, 2024
Collaging, remixing, sampling—art always has been more than the sum of its parts, a synthesis of elements and ideas that produces something new and thought-provoking. Technology has enabled and advanced this enormously, letting us access and manipulate information and images in ways that would’ve been unimaginable just a few decades ago. For Nettrice Gaskins, this is an essential part of the African American experience: the ability to take whatever is at hand—from food to clothes to music to visual art—and combine it with life experience to adapt it into something new and original. She joins EFF’s Cindy Cohn and Jason Kelley to discuss how she takes this approach in applying artificial intelligence to her own artwork, expanding the boundaries of Black artistic thought.

In this episode you’ll learn about:
- Why making art with AI is about much more than just typing a prompt and hitting a button
- How hip-hop music and culture was an early example of technology changing the state of Black art
- Why the concept of fair use in intellectual property law is crucial to the artistic process
- How biases in machine learning training data can affect art
- Why new tools can never replace the mind of a live, experienced artist

Dr. Nettrice R. Gaskins is a digital artist, academic, cultural critic, and advocate of STEAM (science, technology, engineering, arts, and math) fields whose work explores “techno-vernacular creativity” and Afrofuturism. She teaches, writes, “fabs,” and makes art using algorithms and machine learning. She has taught multimedia, visual art, and computer science with high school students, and now is assistant director of the Lesley STEAM Learning Lab at Lesley University. She was a 2021 Ford Global Fellow, serves as an advisory board member for the School of Literature, Media, and Communication at Georgia Tech, and is the author of “Techno-Vernacular Creativity and Innovation” (2021).
She earned a BFA in Computer Graphics with honors from Pratt Institute in 1992; an MFA in Art and Technology from the School of the Art Institute of Chicago in 1994; and a doctorate in Digital Media from Georgia Tech in 2014.

Music credits:
Xena’s Kiss / Medea’s Kiss by mwic (c) copyright 2018. Licensed under a Creative Commons Attribution (3.0) license.
lostTrack by Airtone (c) copyright 2019. Licensed under a Creative Commons Attribution (3.0) license. Ft. mwic
S5 E7 · Tue, May 21, 2024
From Napster to YouTube, some of the most important and controversial uses of the internet have been about building community: connecting people all over the world who share similar interests, tastes, views, and concerns. Big corporations try to co-opt and control these communities, and politicians often promote scary narratives about technology’s dangerous influences, but users have pushed back against monopoly and rhetoric to find new ways to connect with each other.

Alex Winter is a leading documentarian of the evolution of internet communities. He joins EFF’s Cindy Cohn and Jason Kelley to discuss the harms of behavioral advertising, what algorithms can and can’t be blamed for, and promoting the kind of digital literacy that can bring about a better internet—and a better world—for all of us.

In this episode you’ll learn about:
- Debunking the monopolistic myth that communicating and sharing data is theft.
- Demystifying artificial intelligence so that it’s no longer a “black box” impervious to improvement.
- Decentralizing and democratizing the internet so more, diverse people can push technology, online communities, and our world forward.
- Finding a nuanced balance between free speech and harm mitigation in social media.
- Breaking corporations’ addiction to advertising revenue derived from promoting disinformation.

Alex Winter is a director, writer and actor who has worked across film, television and theater. Perhaps best known on screen for “Bill & Ted’s Excellent Adventure” (1989) and its sequels as well as “The Lost Boys” (1987), “Destroy All Neighbors” (2024) and other films, he has directed documentaries including “Downloaded” (2013) about the Napster revolution; “Deep Web” (2015) about the online black market Silk Road and the trial of its creator Ross Ulbricht; “Trust Machine” (2018) about the rise of bitcoin and the blockchain; and “The YouTube Effect” (2022).
He also has directed critically acclaimed documentaries about musician Frank Zappa and about the Panama Papers, the biggest global corruption scandal in history and the journalists who worked in secret and at great risk to break the story.

Music credits:
Perspectives by J.Lang (c) copyright 2019. Licensed under a Creative Commons Attribution (3.0) license. Ft: Sackjo22 and Admiral Bob
S5 E6 · Tue, May 07, 2024
Blind and low-vision people have experienced remarkable gains in information literacy because of digital technologies, like being able to access an online library offering more than 1.2 million books that can be converted to text-to-speech or digital Braille. But it can be a lot harder to come by an accessible map of a neighborhood they want to visit, or any simple diagram, due to limited availability of tactile graphics equipment, design inaccessibility, and publishing practices.

Chancey Fleet wants a technological future that’s more organically attuned to people’s needs, which requires including people with disabilities in every step of the development and deployment process. She speaks with EFF’s Cindy Cohn and Jason Kelley about building an internet that’s just and useful for all, and why this must include giving blind and low-vision people the discretion to decide when and how to engage artificial intelligence tools to solve accessibility problems and surmount barriers.

In this episode you’ll learn about:
- The importance of creating an internet that’s not text-only, but that incorporates tactile images and other technology to give everyone a richer, more fulfilling experience.
- Why AI-powered visual description apps still need human auditing.
- How inclusiveness in tech development is always a work in progress.
- Why we must prepare people with the self-confidence, literacy, and low-tech skills they need to get everything they can out of even the most optimally designed technology.
- Making it easier for everyone to travel the two-way street between enjoyment and productivity online.

Chancey Fleet’s writing, organizing and advocacy explores how cloud-connected accessibility tools benefit and harm, empower and expose communities of disability.
She is the Assistive Technology Coordinator at the New York Public Library’s Andrew Heiskell Braille and Talking Book Library, where she founded and maintains the Dimensions Project, a free open lab for the exploration and creation of accessible images, models and data representations through tactile graphics, 3D models and nonvisual approaches to coding, CAD and “visual” arts. She is a former fellow and current affiliate-in-residence at Data & Society; she is president of the National Federation of the Blind’s Assistive Technology Trainers Division; and she was recognized as a 2017 Library Journal Mover and Shaker.

Music credits:
Probably Shouldn’t by J.Lang (c) copyright 2019. Licensed under a Creative Commons Attribution (3.0) license.
S5 E5 · Tue, April 23, 2024
If you buy something—a refrigerator, a car, a tractor, a wheelchair, or a phone—but you can’t have the information or parts to fix or modify it, is it really yours? The right to repair movement is based on the belief that you should have the right to use and fix your stuff as you see fit, a philosophy that resonates especially in economically trying times, when people can’t afford to just throw away and replace things. Companies for decades have been tightening their stranglehold on the information and the parts that let owners or independent repair shops fix things, but the pendulum is starting to swing back: New York, Minnesota, California, and Colorado have passed right to repair laws, and it’s on the legislative agenda in dozens of other states.

Gay Gordon-Byrne is executive director of The Repair Association, one of the major forces pushing for more and stronger state laws, and for federal reforms as well. She joins EFF’s Cindy Cohn and Jason Kelley to discuss this pivotal moment in the fight for consumers to have the right to products that are repairable and reusable.

In this episode you’ll learn about:
- Why our “planned obsolescence” throwaway culture doesn’t have to be, and shouldn’t be, a technology status quo.
- The harm done by “parts pairing”: software barriers used by manufacturers to keep people from installing replacement parts.
- Why one major manufacturer put out a user manual in France, but not in other countries including the United States.
- How expanded right to repair protections could bring a flood of new local small-business jobs while reducing waste.
- The power of uniting disparate voices—farmers, drivers, consumers, hackers, and tinkerers—into a single chorus that can’t be ignored.

Gay Gordon-Byrne has been executive director of The Repair Association—formerly known as The Digital Right to Repair Coalition—since its founding in 2013, helping lead the fight for the right to repair in Congress and state legislatures.
The group’s credo: if you bought it, you should own it and have the right to use it, modify it, and repair it whenever, wherever, and however you want. Earlier, she had a 40-year career as a vendor, lessor, and used equipment dealer for large commercial IT users; she is the author of “Buying, Supporting and Maintaining Software and Equipment - an IT Manager’s Guide to Controlling the Product Lifecycle” (2014), and a Colgate University alumna.

Music credits:
Come Inside by Zep Hurme (c) copyright 2019. Licensed under a Creative Commons Attribution (3.0) license. Ft: snowflake
Drops of H2O (The Filtered Water Treatment) by J.Lang
S5 E4 · Tue, April 09, 2024
Imagine an internet in which economic power is more broadly distributed, so that more people can build and maintain small businesses online to make good livings. In this world, the behavioral advertising that has made the internet into a giant surveillance tool would be banned, so people could share more equally in the riches without surrendering their privacy.

That’s the world Tim Wu envisions as he teaches and shapes policy on the revitalization of American antitrust law and the growing power of big tech platforms. He joins EFF’s Cindy Cohn and Jason Kelley to discuss using the law to counterbalance the market’s worst instincts, in order to create an internet focused more on improving people’s lives than on meaningless revenue generation.

In this episode you’ll learn about:
- Getting a better “deal” in trading some of your data for connectedness.
- Building corporate structures that do a better job of balancing the public good with private profits.
- Creating a healthier online ecosystem with corporate “quarantines” to prevent a handful of gigantic companies from dominating the entire internet.
- Nurturing actual innovation of products and services online, not just newer price models.

Timothy Wu is the Julius Silver Professor of Law, Science and Technology at Columbia Law School, where he has served on the faculty since 2006. First known for coining the term “net neutrality” in 2002, he served in President Joe Biden’s White House as special assistant to the President for technology and competition policy from 2021 to 2023; he also had worked on competition policy for the National Economic Council during the last year of President Barack Obama’s administration. Earlier, he worked in antitrust enforcement at the Federal Trade Commission and served as enforcement counsel in the New York Attorney General’s Office.
His books include “The Curse of Bigness: Antitrust in the New Gilded Age” (2018), “The Attention Merchants: The Epic Scramble to Get Inside Our Heads” (2016), “The Master Switch: The Rise and Fall of Information Empires” (2010), and “Who Controls the Internet? Illusions of a Borderless World” (2006).

Music credits:
Perspectives by J.Lang (c) copyright 2019. Licensed under a Creative Commons Attribution (3.0) license. Ft: Sackjo22 and Admiral Bob
S5 E3 · Tue, March 26, 2024
Is your face truly your own, or is it a commodity to be sold, a weapon to be used against you? A company called Clearview AI has scraped the internet to gather (without consent) 30 billion images to support a tool that lets users identify people by picture alone. Though it’s primarily used by law enforcement, should we have to worry that the eavesdropper at the next restaurant table, or the creep who’s bothering you in the bar, or the protestor outside the abortion clinic can surreptitiously snap a pic of you, upload it, and use it to identify you, where you live and work, your social media accounts, and more?

New York Times reporter Kashmir Hill has been writing about the intersection of privacy and technology for well over a decade; her book about Clearview AI’s rise and practices was published last fall. She speaks with EFF’s Cindy Cohn and Jason Kelley about how face recognition technology’s rapid evolution may have outpaced ethics and regulations, and where we might go from here.

In this episode, you’ll learn about:
- The difficulty of anticipating how information that you freely share might be used against you as technology advances.
- How the all-consuming pursuit of “technical sweetness” — the alluring sensation of neatly and functionally solving a puzzle — can blind tech developers to the implications of that tech’s use.
- The racial biases that were built into many face recognition technologies.
- How one state’s 2008 law has effectively curbed how face recognition technology is used there, perhaps creating a model for other states or Congress to follow.

Kashmir Hill is a New York Times tech reporter who writes about the unexpected and sometimes ominous ways technology is changing our lives, particularly when it comes to our privacy. Her book, “Your Face Belongs To Us” (2023), details how Clearview AI gave facial recognition to law enforcement, billionaires, and businesses, threatening to end privacy as we know it.
She joined The Times in 2019 after having worked at Gizmodo Media Group, Fusion, Forbes Magazine and Above the Law. Her writing has appeared in The New Yorker and The Washington Post. She has degrees from Duke University and New York University, where she studied journalism.

This podcast is licensed Creative Commons Attribution 4.0 International, and includes music licensed Creative Commons Attribution 3.0 Unported by their creators. This episode features:
Kalte Ohren by Alex (c) copyright 2019. Licensed under a Creative Commons Attribution (3.0) license. Ft: starfrosch & Jerry Spoon
Drops of H2O (The Filtered Water Treatment) by J.Lang
S5 E2 · Tue, March 12, 2024
Imagine a world in which the internet is first and foremost about empowering people, not big corporations and government. In that world, government does “after-action” analyses to make sure its tech regulations are working as intended, recruits experienced technologists as advisors, and enforces real accountability for intelligence and law enforcement programs.

Ron Wyden has spent decades working toward that world, first as a congressman and now as Oregon’s senior U.S. Senator. Long among Congress’ most tech-savvy lawmakers, he helped write the law that shaped and protects the internet as we know it, and he has fought tirelessly against warrantless surveillance of Americans’ telecommunications data. Wyden speaks with EFF’s Cindy Cohn and Jason Kelley about his “I squared”—individuals and innovation—legislative approach to foster an internet that benefits everyone.

In this episode you’ll learn about:
- How a lot of the worrisome online content that critics blame on Section 230 is actually protected by the First Amendment
- Requiring intelligence and law enforcement agencies to get warrants before obtaining Americans’ private telecommunications data
- Why “foreign” is the most important word in “Foreign Intelligence Surveillance Act”
- Making government officials understand that national security isn’t heightened by reducing privacy
- Protecting women from having their personal data weaponized against them

U.S. Sen. Ron Wyden, D-OR, has served in the Senate since 1996; he was elected to his current six-year term in 2022. He chairs the Senate Finance Committee, and serves on the Energy and Natural Resources Committee, the Budget Committee, and the Select Committee on Intelligence; he also is the lead Senate Democrat on the Joint Committee on Taxation.
His relentless defiance of the national security community’s abuse of secrecy forced the declassification of the CIA Inspector General’s 9/11 report, shut down the controversial Total Information Awareness program, and put a spotlight on both the Bush and Obama administrations’ reliance on “secret law.” In 2006 he introduced the first Senate bill on net neutrality, and in 2011 he was the lone Senator to stand against the Stop Online Piracy Act (SOPA) and the PROTECT IP Act (PIPA), ultimately unsuccessful bills that purportedly were aimed at fighting online piracy.
S5 E1 · Tue, February 27, 2024
What if we thought about democracy as a kind of open-source social technology, in which everyone can see the how and why of policy making, and everyone’s concerns and preferences are elicited in a way that respects each person’s community, dignity, and importance? This is what Audrey Tang has worked toward as Taiwan’s first Digital Minister, a position the free software programmer has held since 2016. She has taken the best of open source and open culture and successfully used them to help reform her country’s government.

Tang speaks with EFF’s Cindy Cohn and Jason Kelley about how Taiwan has shown that openness not only works but can outshine more authoritarian competitors, whose governments often lock up data.

In this episode you’ll learn about:
- Using technology including artificial intelligence to help surface our areas of agreement, rather than to identify and exacerbate our differences
- The “radical transparency” of recording and making public every meeting in which a government official takes part, to shed light on the policy-making process
- How Taiwan worked with civil society to ensure that no privacy and human rights were traded away for public health and safety during the COVID-19 pandemic
- Why maintaining credible neutrality from partisan politics and developing strong public and civic digital infrastructure are key to advancing democracy

Audrey Tang has served as Taiwan’s first Digital Minister since 2016, by which time she already was known for revitalizing the computer languages Perl and Haskell, as well as for building the online spreadsheet system EtherCalc in collaboration with Dan Bricklin. In the public sector, she served on the Taiwan National Development Council’s open data committee and basic education curriculum committee and led the country’s first e-Rulemaking project.
In the private sector, she worked as a consultant with Apple on computational linguistics, with Oxford University Press on crowd lexicography, and with Socialtext on social interaction design. In the social sector, she actively contributes to g0v (“gov zero”), a vibrant community focused on creating tools for civil society, with the call to “fork the government.”
Trailer · Tue, February 13, 2024
We cannot build a better future unless we can envision it. EFF’s How to Fix the Internet returns with another season full of inspiring conversations with some of the smartest and most interesting people around who are thinking about how to make the internet — and the world — a better place for all of us. Co-hosts Executive Director Cindy Cohn and Activism Director Jason Kelley will speak with people like journalist Kashmir Hill, Taiwan’s minister of digital affairs Audrey Tang, former White House advisor Tim Wu, digital artist Dr. Nettrice Gaskins and actor and filmmaker Alex Winter, among others.

Everywhere we turn, we see dystopian stories about technology’s impact on our lives and our futures — from tracking-based surveillance capitalism to street-level government surveillance, from the dominance of a few large platforms choking innovation to the growing pressure by authoritarian governments to control what we see and say — and the landscape can feel bleak. Exposing and articulating these problems is important, but so is envisioning and then building a better future. That’s where our podcast comes in. Through curious conversations with some of the leading minds in law and technology, How to Fix the Internet explores creative solutions to some of today’s biggest tech challenges and offers a better way forward.
Bonus · Mon, December 11, 2023
We are hard at work on our fifth season of How to Fix the Internet, and we are so excited to be bringing you insightful and inspiring conversations with some of the smartest and most interesting people around who are thinking about how to make the internet — and the world — a better place for all of us. People like journalist Kashmir Hill, Taiwan’s minister of digital affairs Audrey Tang, former White House advisor Tim Wu, and actor and filmmaker Alex Winter.

Plus, we were thrilled to learn that How to Fix the Internet is a finalist for the Anthem Awards in the responsible tech category. The Anthem Awards are companions to the well-known Webby Awards, which we were nominated for way back in 2004 but did not win. Winning an Anthem Award will absolutely help us reach more ears, and more corners of the internet. If you enjoy the show, please go to www.eff.org/anthem, scroll down until you see EFF, and click “CELEBRATE” to vote for How to Fix the Internet. Thanks so much for your support, and stay tuned for our new season, coming soon!
Bonus · Wed, August 30, 2023
This episode was first published on May 24, 2022. Pam Smith has been working to secure US elections for years, and now as the CEO of Verified Voting, she has some important ideas about the role the internet plays in American democracy. Pam joins Cindy and Danny to explain how elections can be more transparent and more engaging for all. U.S. democracy is at an inflection point, and how we administer and verify our elections is more important than ever. From hanging chads to glitchy touchscreens to partisan disinformation, too many Americans worry that their votes won’t count and that election results aren’t trustworthy. It’s crucial that citizens have well-justified confidence in this pillar of our republic. Technology can provide answers - but that doesn’t mean moving elections online. As president and CEO of the nonpartisan nonprofit Verified Voting, Pamela Smith helps lead the national fight to balance ballot accessibility with ballot security by advocating for paper trails, audits, and transparency wherever and however Americans cast votes. On this episode of How to Fix the Internet, Pamela Smith joins EFF’s Cindy Cohn and Danny O’Brien to discuss hope for the future of democracy and the technology and best practices that will get us there. In this episode you’ll learn about: Why voting online can never be like banking or shopping online What a “risk-limiting audit” is, and why no election should lack it Whether open-source software could be part of securing our votes Where to find reliable information about how your elections are conducted Pamela Smith, President & CEO of Verified Voting, plays a national leadership role in safeguarding elections and building working alliances between advocates, election officials, and other stakeholders. Pam joined Verified Voting in 2004, and previously served as President from 2007-2017. 
She is a member of the National Task Force on Election Crises, a diverse cross-partisan group of more than 50 experts whose mission is to prevent and mitigate election crises by urging critical reforms. She provides information and public testimony on election security issues across the nation, including to Congress. Before her work in elections, she was a nonprofit executive for a Hispanic educational organization working on first language literacy and adult learning, and a small business and marketing consultant. This podcast is supported by the Alfred P. Sloan Foundation's Program in Public Understanding of Science and Technology. Music for How to Fix the Internet was created for us by Reed Mathis and Nat Keefe of BeatMower. This podcast is licensed Creative Commons Attribution 4.0 International, and includes the following music licensed Creative Commons Attribution 3.0 Unported by their creators: http://dig.ccmixter.org/files/Skill_Borrow
S4 E10 · Tue, May 30, 2023
Writers sit watching a stranger’s search engine terms being typed in real time, a voyeuristic peek into that person’s most private thoughts. A woman lands a dream job at a powerful tech company but uncovers an agenda affecting the lives of all of humanity. An app developer keeps pitching the craziest, most harmful ideas she can imagine but the tech mega-monopoly she works for keeps adopting them, to worldwide delight. The first instance of deep online creepiness actually happened to Dave Eggers almost 30 years ago. The latter two are plots of two of Eggers’ many bestselling novels—“The Circle” and “The Every,” respectively—inspired by the author’s continuing rumination on how much is too much on the internet. He believes we should live intentionally, using technology when it makes sense but otherwise logging off and living an analog, grounded life. Eggers — whose newest novel, “The Eyes and the Impossible,” was published this month — speaks with EFF’s Cindy Cohn and Jason Kelley about why he hates Zoom so much, how and why we get sucked into digital worlds despite our own best interests, and painting the darkest version of our future so that we can steer away from it. In this episode, you’ll learn about: How that three-digit credit score that you keep striving to improve symbolizes a big problem with modern tech. The difficulties of distributing books without using Amazon. Why round-the-clock surveillance by schools, parents, and others can be harmful to kids. The vital importance of letting yourself be bored and unstructured sometimes. Dave Eggers is the bestselling author of his memoir “ A Heartbreaking Work of Staggering Genius ” (2000) as well as novels including “ What Is the What ” (2006), “ A Hologram for the King ” (2012), “ The Circle ” (2013), and “ The Every ” (2021); his latest novel, “ The Eyes and the Impossible ,” was published May 9. 
He founded the independent publishing company McSweeney’s as well as its namesake daily humor website, and he co-founded 826 Valencia , a nonprofit youth writing center that has inspired over 70 similar organizations worldwide. Eggers is a winner of the American Book Award , the Muhammad Ali Humanitarian Award for Education , and the Dayton Literary Peace Prize , among other honors.
S4 E9 · Tue, May 16, 2023
People with disabilities were the original hackers. The world can feel closed to them, so they often have had to be self-reliant in how they interact with society. And that creativity and ingenuity is an unappreciated resource. Henry Claypool has been an observer and champion of that resource for decades, both in government and in the nonprofit sector. He’s a national policy expert and consultant specializing in both disability policy and technology policy, particularly where they intersect. He knows real harm can result from misuse of technology, intentionally or not, and people with disabilities frequently end up at the bottom of the list on inclusion. Claypool joins EFF’s Cindy Cohn and Jason Kelley to talk about motivating tech developers to involve disabled people in creating a world where people who function differently have a smooth transition into any forum and can engage with a wide variety of audiences, a seamless inclusion in the full human experience. In this episode, you’ll learn about: How accessibility asks, “Can we knock on the door?” while inclusion says, “Let’s build a house that already has all of us inside it.” Why affordable broadband programs must include disability-related costs. Why disability inclusion discussions must involve intersectional voices such as people of color and the LGBTQI+ community. How algorithms and artificial intelligence used in everything from hiring tools to social services platforms too often produce results skewed against people with disabilities. Henry Claypool is a technology policy consultant and former executive vice president at the American Association of People with Disabilities , which promotes equal opportunity, economic power, independent living and political participation for people with disabilities. He is the former director of the U.S. Health and Human Services Office on Disability and a founding principal deputy administrator of the Administration for Community Living .
He was appointed by President Barack Obama to the Federal Commission on Long-Term Care , advising Congress on how long-term care can be better provided and financed for the nation’s older adults and people with disabilities, now and in the future. He is a visiting scientist with the Lurie Center for Disability Policy in the Heller School for Social Policy and Management at Brandeis University, and principal of Claypool Consulting. This podcast is supported by the Alfred P. Sloan Foundation's Program in Public Understanding of Science and Technology. Music for How to Fix the Internet was created for us by Reed Mathis and Nat Keefe of BeatMower. This podcast is licensed Creative Commons Attribution 4.0 International.
S4 E8 · Tue, May 02, 2023
Dr. Seuss wrote a story about a Hawtch-Hawtcher Bee-Watcher whose job it is to watch his town’s one lazy bee, because “a bee that is watched will work harder, you see.” But that doesn’t seem to work, so another Hawtch-Hawtcher is assigned to watch the first, and then another to watch the second... until the whole town is watching each other watch a bee. To Federal Trade Commissioner Alvaro Bedoya, the story—which long predates the internet—is a great metaphor for why we must be wary of workplace surveillance, and why we need to strengthen our privacy laws. Bedoya has made a career of studying privacy, trust, and competition, and wishes for a world in which we can do, see, and read what we want, living our lives without being held back by our identity, income, faith, or any other attribute. In that world, all our interactions with technology—from social media to job or mortgage applications—are on a level playing field. Bedoya speaks with EFF’s Cindy Cohn and Jason Kelley about how fixing the internet should allow all people to live their lives with dignity, pride, and purpose. In this episode, you’ll learn about: The nuances of work that “bossware,” employee surveillance technology, can’t catch. Why the Health Insurance Portability and Accountability Act (HIPAA) isn’t the privacy panacea you might think it is. Making sure that one-size-fits-all privacy rules don’t backfire against new entrants and small competitors. How antitrust fundamentally is about small competitors and working people, like laborers and farmers, deserving fairness in our economy. Alvaro Bedoya was nominated by President Joe Biden, confirmed by the U.S. Senate, and sworn in on May 16, 2022, as a Commissioner of the Federal Trade Commission ; his term expires in September 2026. Bedoya was the founding director of the Center on Privacy & Technology at Georgetown University Law Center , where he was also a visiting professor of law.
He has been influential in research and policy at the intersection of privacy and civil rights, and co-authored a 2016 report on the use of facial recognition by law enforcement and the risks that it poses. He previously served as the first Chief Counsel to the Senate Judiciary Subcommittee on Privacy, Technology and the Law after its founding in 2011, and as Chief Counsel to former U.S. Sen. Al Franken (D-MN); earlier, he was an associate at the law firm WilmerHale . A naturalized immigrant born in Peru and raised in upstate New York, Bedoya previously co-founded the Esperanza Education Fund , a college scholarship for immigrant students in the District of Columbia, Maryland, and Virginia.
S4 E7 · Tue, April 18, 2023
An internet that is safe for sex workers is an internet that is safer for everyone. Though the effects of stigmatization and criminalization run deep, the sex worker community exemplifies how technology can help people reduce harm, share support, and offer experienced analysis to protect each other. But a 2018 federal law purportedly aimed at stopping sex trafficking, FOSTA-SESTA , led to shutdowns of online spaces where sex workers could talk, putting at increased risk some of the very people it was supposed to protect. Public interest technology lawyer Kendra Albert and sex worker, activist, and researcher Danielle Blunt have been fighting for sex workers’ online rights for years. They say that this marginalized group’s experience can be a valuable model for protecting all of our free speech rights, and that holding online platforms legally responsible for user speech can lead to censorship that hurts us all. Albert and Blunt join EFF’s Cindy Cohn and Jason Kelley to talk about the failures of FOSTA-SESTA, the need for encryption to create a safe internet, and how to create cross-movement relationships with other activists for bodily autonomy so that all internet users can continue to build online communities that keep them safe and free. In this episode, you’ll learn about: How criminalization sometimes harms those whom it is meant to protect. How end-to-end encryption goes hand-in-hand with shared community wisdom to protect speech about things that are—or might ever be—criminalized. Viewing community building, mutual aid, and organizing as a kind of technology. The importance of centering those likely to be impacted in conversations about policy solutions. Kendra Albert is a public interest technology lawyer with a special interest in computer security law and freedom of expression. 
They serve as a clinical instructor at the Cyberlaw Clinic at Harvard Law School , where they teach students to practice law by working with pro bono clients; they also founded and direct the Initiative for a Representative First Amendment . They serve on the boards of the ACLU of Massachusetts and the Tor Project , and provide support as a legal advisor for Hacking//Hustling . They earned a B.H.A. in History and Lighting Design from Carnegie Mellon University and a J.D. from Harvard Law School, cum laude. Danielle Blunt is a sex worker, community organizer, public health researcher, and co-founder of Hacking//Hustling .
S4 E6 · Tue, April 04, 2023
When a science-fiction villain is defeated, we often see the heroes take their victory lap and then everyone lives happily ever after. But that’s not how real struggles work: In real life, victories are followed by repairs, rebuilding, and reparations, by analysis and introspection, and often, by new battles. Science-fiction author and science journalist Annalee Newitz knows social change is a never-ending process, and revolutions are long and sometimes kind of boring. Their novels and nonfiction books, however, are anything but boring—they write dynamically about the future we actually want and can attain, not an idealized and unattainable daydream. They’re involved in a project called “We Will Rise Again,” an anthology pairing science fiction writers with activists to envision realistically how we can do things better as a neighborhood, a community, or a civilization. Newitz speaks with EFF’s Cindy Cohn and Jason Kelley about depicting true progress as a long-haul endeavor, understanding that failure is part of the process, and creating good law as a form of world-building and improving our future. In this episode, you’ll learn about: Why the Star Wars series “Andor” is a good depiction of the brutal, draining nature of engaging in protracted action against a repressive regime. The nature of the “hopepunk” genre, and how it acknowledges that things are tough and one small victory is not the end of oppression. How alien, animal, and artificial characters in fiction can help us examine and improve upon human relationships and how we use our resources. How re-thinking our allocation and protection of physical and intellectual property could bring about a more just future. Annalee Newitz writes science fiction and nonfiction. Their new novel—“ The Terraformers ” (2023)—led Scientific American to comment, “It’s easy to imagine future generations studying this novel as a primer for how to embrace solutions to the challenges we all face.”
Their first novel—“Autonomous” (2017)—won the Lambda Literary Award . As a science journalist, they are the author of “ Four Lost Cities: A Secret History of the Urban Age ” (2021) and “ Scatter, Adapt and Remember: How Humans Will Survive a Mass Extinction ” (2013), which was a finalist for the LA Times Book Prize in science. They are a writer for the New York Times .
S4 E5 · Tue, March 21, 2023
The promise of the internet was that it would be a tool to melt barriers and aid truth-seekers everywhere. But it feels like polarization has worsened in recent years, and more internet users are being misled into embracing conspiracies and cults. From QAnon to anti-vax screeds to talk of an Illuminati bunker beneath Denver International Airport, Alice Marwick has heard it all. She has spent years researching some dark corners of the online experience: the spread of conspiracy theories and disinformation. She says many people see conspiracy theories as participatory ways to be active in political and social systems from which they feel left out, building upon beliefs they already harbor to weave intricate and entirely false narratives. Marwick speaks with EFF’s Cindy Cohn and Jason Kelley about finding ways to identify and leverage people’s commonalities to stem this flood of disinformation while ensuring that the most marginalized and vulnerable internet users are still empowered to speak out. In this episode you’ll learn about: Why seemingly ludicrous conspiracy theories get so many views and followers How disinformation is tied to personal identity and feelings of marginalization and disenfranchisement When fact-checking does and doesn’t work Thinking about online privacy as a political and structural issue rather than something that can be solved by individual action Alice Marwick is an Associate Professor in the Department of Communication and cofounder and Principal Researcher at the Center for Information, Technology and Public Life at the University of North Carolina, Chapel Hill. She researches the social, political, and cultural implications of popular social media technologies. In 2017, she co-authored Media Manipulation and Disinformation Online (Data & Society), a flagship report examining far-right online subcultures’ use of social media to spread disinformation, for which she was named one of Foreign Policy magazine’s 2017 Global Thinkers . 
She is the author of Status Update: Celebrity, Publicity and Branding in the Social Media Age (Yale 2013), an ethnographic study of the San Francisco tech scene which examines how people seek social status through online visibility, and co-editor of The Sage Handbook of Social Media (Sage 2017). Her forthcoming book is The Private is Political (Yale 2023).
S4 E4 · Tue, March 07, 2023
What would the internet look like if it weren't the greatest technology of mass surveillance in the history of mankind? Trevor Paglen wonders about this, and he makes art from it. To Paglen, art is a conversation with the past and the future – artifacts of how the world looks at a certain time and place. In our time and place, it’s a world dogged by digital privacy concerns, and so his art ranges from 19th-century style photos of military drones circling like insects in the Nevada sky, to a museum installation that provides a free wifi hotspot offering anonymized browsing through a Tor network, to deep-sea diving photos of internet cables tapped by the National Security Agency. Paglen speaks with EFF's Cindy Cohn and Jason Kelley about making the invisible visible: creating physical manifestations of the data collection and artificial intelligence that characterize today’s internet so that people can reflect on how to make tomorrow’s internet far better for us all. In this episode you’ll learn about: The blurred edges between art, law, and activism in creating spaces for people to think differently. Exploring the contradictions of technology that is both beautiful and scary. Creating an artistic vocabulary and culture that helps viewers grasp technical and political issues. Changing the attitude that technology is neutral, and instead illuminating and mitigating its impacts on society. Trevor Paglen is an artist whose work spans image-making, sculpture, investigative journalism, writing, engineering, and numerous other disciplines with a focus on mass surveillance, data collection, and artificial intelligence. He has had one-person exhibitions at the Smithsonian Museum of American Art in Washington D.C.; the Carnegie Museum of Art in Pittsburgh; the Fondazione Prada in Milan; the Barbican Centre in London; the Vienna Secession in Vienna; and Protocinema in Istanbul . 
He has launched an artwork into Earth orbit , contributed research and cinematography to the Academy Award-winning film “ Citizenfour ,” and created a radioactive public sculpture for the exclusion zone in Fukushima, Japan. The author of several books and numerous articles, he won a 2017 MacArthur Fellowship “genius grant.”
S4 E3 · Tue, February 21, 2023
Too often we let the rich and powerful dictate what technology’s future will be, from Mark Zuckerberg’s Metaverse to Elon Musk’s neural implants. But what if we all were empowered to use our voices and perspectives to imagine a better world in which we all can thrive while creating and using technology as we choose? That idea guides Deji Bryce Olukotun’s work both as a critically acclaimed author and as a tech company’s social impact chief. Instead of just envisioning the oligarch-dominated dystopia we fear, he believes speculative fiction can instead paint a picture of healthy, open societies in which all share in technology’s economic bounty. It can also help to free people’s imaginations to envision more competitive, level playing fields. Then we can use those diverse visions to guide policy solutions, from antitrust enforcement to knocking down the laws that stymie innovation. Olukotun speaks with EFF’s Cindy Cohn and Jason Kelley about rejecting the inevitability of the tech future that profit-driven corporate figureheads describe, and choosing instead to exercise the right to imagine our own future and leverage that vision into action. In this episode you’ll learn about: The influence of George W. Bush’s presidency and Silicon Valley’s rapid expansion on Olukotun’s seminal “Nigerians in Space.” The value in envisioning a “post-scarcity” world. Using speculative fiction to more accurately portray the long, complicated arc of civil liberties battles. The importance of stakeholder-based activism in advancing solutions to critical issues from protecting democracy to combating climate change. Deji Bryce Olukotun is the author of two novels and his fiction has appeared in five book collections. His novel “ After the Flare ” won the 2018 Philip K. Dick special citation and was chosen as one of the best books of 2017 by The Guardian, The Washington Post, Syfy.com, Tor.com, Kirkus Reviews, among others. 
A former Future Tense Fellow at New America, Olukotun is Head of Social Impact at Sonos , leading the audio technology company’s grantmaking and social activations. He previously worked at the digital rights organization Access Now , where he drove campaigns on fighting internet shutdowns, cybersecurity, and online censorship. Olukotun graduated from Yale College and Stanford Law School, and earned a Master’s in creative writing at the University of Cape Town. If you have any feedback on this episode, please email podcast@eff.org . Please visit the site page at eff.org/pod303 where you’ll find resources – including links to important legal cases and research discussed in the podcast and a full transcript of the audio.
S4 E2 · Tue, February 07, 2023
When a tech company moves to your city, the effects ripple far beyond just the people it employs. It can impact thousands of ancillary jobs – from teachers to nurses to construction workers – as well as the community’s housing, transportation, health care, and other businesses. And too often, these impacts can be negative. Catherine Bracy, co-founder and CEO of the Oakland-based TechEquity Collaborative, has spent her career exploring ways to build a more equitable tech-driven economy. She believes that because the technology sector became a major economic driver at the same time deregulation became politically fashionable, tech companies often didn’t catch the “civic bug” – a sense of responsibility to the communities in which they’re based – in the way that industries of the past might have. Bracy speaks with EFF's Cindy Cohn and Jason Kelley about following the money and changing the regulations that underpin the tech sector so that companies are more inclined to be thoughtful about supporting, not exploiting, the places and people they call home – creating stronger, thriving communities. In this episode you’ll learn about: How the venture capital model of funding contributes to tech’s reticence on civic engagement. How the “platform mentality” affects non-tech workers and their communities. Why the law should treat tech companies the same as other companies, without special carve-out exceptions and exemptions. Why tech workers being well-informed about their companies’ and products’ impacts, as well as taking active roles in their communities, can be a game-changer. Catherine Bracy is a civic technologist and community organizer whose work focuses on the intersection of technology and political and economic inequality. She is the co-founder and CEO of TechEquity Collaborative , an organization based in Oakland, CA, that mobilizes tech workers and companies to advocate for economic equity in our communities. 
She was previously Code for America ’s Senior Director of Partnerships and Ecosystem, where she grew the Brigade program into a network of over 50,000 civic tech volunteers in more than 80 U.S. cities. She also founded Code for All , the global network of Code-for organizations with partners on six continents. During the 2012 election cycle she was Director of Obama for America’s Technology Field Office in San Francisco, the first of its kind in American political history. Earlier, she was administrative director of the Berkman Center for Internet & Society at Harvard Law School. This podcast is supported by the Alfred P. Sloan Foundation's Program in Public Understanding of Science and Technology. If you have any feedback on this episode, please email podcast@eff.org .
S4 E1 · Tue, January 24, 2023
What can a bustling electronic components bazaar in Shenzhen, China, tell us about building a better technology future? To researcher and hacker Andrew “bunnie” Huang, it symbolizes the boundless motivation, excitement, and innovation that can be unlocked if people have the rights to repair, tinker, and create. Huang believes that to truly unleash innovation that betters everyone, we must replace our current patent and copyright culture with one that truly values making products better, cheaper, and more reliable by encouraging competition around production, quality, and cost optimization. He wants to remind people of the fun, inspiring era when makers didn’t have to live in fear of patent trolls, and to encourage them to demand a return of the “permissionless ecosystem” that nurtured so many great ideas. Huang speaks with EFF's Cindy Cohn and Jason Kelley about how we can have it all – from better phones to cooler drones, from handy medical devices to fun Star Wars fan gadgets – if we’re willing to share ideas and trade short-term profit for long-term advancement. In this episode you’ll learn about: How “rent-seeking behavior” stifles innovation. Why questioning authority and “poking the tigers” of patent law is necessary to move things forward. What China can teach the United States about competitive production that advances creative invention. How uniting hardware and software hackers, fan fiction creators, farmers who want to repair their tractors, and other stakeholders into a single, focused right-to-repair movement could change the future of technology. Andrew “bunnie” Huang is an American security researcher and hardware hacker with a long history in reverse engineering. He’s the author of the widely respected 2003 book “ Hacking the Xbox: An Introduction to Reverse Engineering ,” and he has since served as a research affiliate for the Massachusetts Institute of Technology Media Lab and as a technical advisor for several hardware startups.
EFF awarded him a Pioneer Award in 2012 for his work in hardware hacking, open source, and activism. A native of Kalamazoo, Michigan, he holds a Ph.D. in electrical engineering from MIT and lives in Singapore. If you have any feedback on this episode, please email podcast@eff.org . Please visit the site page at https://eff.org/pod301 . Find the podcast via RSS , Stitcher , TuneIn , Apple Podcasts , Google Podcasts , and Spotify .
Trailer · Mon, January 09, 2023
It seems like everywhere we turn we see dystopian stories about technology’s impact on our lives and our futures — from tracking-based surveillance capitalism to street level government surveillance to the dominance of a few large platforms choking innovation to the growing pressure by authoritarian governments to control what we see and say — the landscape can feel bleak. Exposing and articulating these problems is important, but so is envisioning and then building a better future. That’s where our podcast comes in. EFF's How to Fix the Internet podcast offers a better way forward. Through curious conversations with some of the leading minds in law and technology, we explore creative solutions to some of today’s biggest tech challenges. Find the podcast via RSS , Stitcher , TuneIn , Apple Podcasts , Google Podcasts , and Spotify . You can find an MP3 archive of all our episodes at the Internet Archive . Theme music by Nat Keefe of BeatMower. EFF is deeply grateful for the support of the Alfred P. Sloan Foundation's Program in Public Understanding of Science and Technology, without whom this podcast would not be possible.
Trailer · Wed, November 09, 2022
It seems like everywhere we turn we see dystopian stories about technology’s impact on our lives and our futures — from tracking-based surveillance capitalism to street level government surveillance to the dominance of a few large platforms choking innovation to the growing pressure by authoritarian governments to control what we see and say — the landscape can feel bleak. Exposing and articulating these problems is important, but so is envisioning and then building a better future. That’s where our podcast comes in. EFF's How to Fix the Internet podcast offers a better way forward. Through curious conversations with some of the leading minds in law and technology, we explore creative solutions to some of today’s biggest tech challenges. After tens of thousands of listeners tuned in for our pilot mini-series last year, we are continuing the conversation by launching a full season. Listen today to become deeply informed on vital technology issues and join the movement working to build a better technological future. Find the podcast via RSS , Stitcher , TuneIn , Apple Podcasts , Google Podcasts , and Spotify . You can find an MP3 archive of all our episodes at the Internet Archive . Theme music by Nat Keefe of BeatMower. EFF is deeply grateful for the support of the Alfred P. Sloan Foundation's Program in Public Understanding of Science and Technology, without whom this podcast would not be possible.
S3 E10 · Tue, May 31, 2022
Where is the internet we were promised? It feels like we’re dominated by megalithic, siloed platforms where users have little or no say over how their data is used and little recourse if they disagree, where direct interaction with users is seen as a bug to be fixed, and where art and creativity are just “content generation.” But take a peek beyond those platforms and you can still find a thriving internet of millions who are empowered to control their own technology, art, and lives. Anil Dash, CEO of Glitch and an EFF board member, says this is where we start reclaiming the internet for individual agency, control, creativity, and connection to culture - especially among society’s most vulnerable and marginalized members. Dash speaks with EFF's Cindy Cohn and Danny O’Brien about building more humane and inclusive technology, and leveraging love of art and culture into grassroots movements for an internet that truly belongs to us all. In this episode you’ll learn about: What past and current social justice movements can teach us about reclaiming the internet The importance of clearly understanding and describing what we want—and don’t want—from technology Energizing people in artistic and fandom communities to become activists for better technology Tech workers’ potential power over what their employers do How Wordle might be a window into a healthier web. If you have any feedback on this episode, please email podcast@eff.org. Please visit the site page at https://eff.org/pod210 where you’ll find resources – including links to important legal cases and research discussed in the podcast and a full transcript of the audio. This podcast is supported by the Alfred P. Sloan Foundation's Program in Public Understanding of Science and Technology. Music for How to Fix the Internet was created for us by Reed Mathis and Nat Keefe of BeatMower. 
This podcast is licensed Creative Commons Attribution 4.0 International, and includes the following music licensed Creative Commons Attribution 3.0 Unported by their creators:
- Get It - pop mix by J.Lang Feat: AnalogByNature & RJay: http://dig.ccmixter.org/files/djlang59/61577
- Probably Shouldn't by J.Lang: http://dig.ccmixter.org/files/djlang59/59729
- Smokey Eyes by Stefan Kartenberg: http://dig.ccmixter.org/files/JeffSpeed68/56377
- commonGround by airtone: http://dig.ccmixter.org/files/airtone/58703
- Klaus by Skill_Borrower: http://dig.ccmixter.org/files/Skill_Borrower/41751
S3 E9 · Tue, May 24, 2022
U.S. democracy is at an inflection point, and how we administer and verify our elections is more important than ever. From hanging chads to glitchy touchscreens to partisan disinformation, too many Americans worry that their votes won't count and that election results aren't trustworthy. It's crucial that citizens have well-justified confidence in this pillar of our republic. Technology can provide answers - but that doesn't mean moving elections online. As president and CEO of the nonpartisan nonprofit Verified Voting, Pamela Smith helps lead the national fight to balance ballot accessibility with ballot security by advocating for paper trails, audits, and transparency wherever and however Americans cast votes. On this episode of How to Fix the Internet, Pamela Smith joins EFF's Cindy Cohn and Danny O'Brien to discuss hope for the future of democracy and the technology and best practices that will get us there. In this episode you'll learn about:
- Why voting online can never be like banking or shopping online
- What a "risk-limiting audit" is, and why no election should lack it
- Whether open-source software could be part of securing our votes
- Where to find reliable information about how your elections are conducted
If you have any feedback on this episode, please email podcast@eff.org. Please visit the site page at https://eff.org/pod209 where you'll find resources – including links to important legal cases and research discussed in the podcast and a full transcript of the audio. Pamela Smith, President & CEO of Verified Voting, plays a national leadership role in safeguarding elections and building working alliances between advocates, election officials, and other stakeholders. Pam joined Verified Voting in 2004, and previously served as President from 2007-2017. She is a member of the National Task Force on Election Crises, a diverse cross-partisan group of more than 50 experts whose mission is to prevent and mitigate election crises by urging critical reforms.
She provides information and public testimony on election security issues across the nation, including to Congress. Before her work in elections, she was a nonprofit executive for a Hispanic educational organization working on first language literacy and adult learning, and a small business and marketing consultant. This podcast is supported by the Alfred P. Sloan Foundation's Program in Public Understanding of Science and Technology. Music for How to Fix the Internet was created for us by Reed Mathis and Nat Keefe of BeatMower. This podcast is licensed Creative Commons Attribution 4.0 International, and includes the following music licensed Creative Commons Attribution 3.0 Unported by their creators:
- Klaus by Skill_Borrower: http://dig.ccmixter.org/files/Skill_Borrower/41751
S3 E8 · Tue, May 17, 2022
It often feels like machine learning experts are running around with a hammer, looking at everything as a potential nail - they have a system that does cool things and is fun to work on, and they go in search of things to use it for. But what if we flip that around and start by working with people in various fields - education, health, or economics, for example - to clearly define societal problems, and then design algorithms providing useful steps to solve them? Rediet Abebe, a researcher and professor of computer science at UC Berkeley, spends a lot of time thinking about how machine learning functions in the real world, and working to make the results of machine learning processes more actionable and more equitable. Abebe joins EFF's Cindy Cohn and Danny O'Brien to discuss how we redefine the machine learning pipeline - from creating a more diverse pool of computer scientists to rethinking how we apply this tech for the betterment of society's most marginalized and vulnerable - to make real, positive change in people's lives. In this episode you'll learn about:
- The historical problems with the official U.S. poverty measurement
- How machine learning can (and can't) lead to more just verdicts in our criminal courts
- How equitable data sharing practices could help nations and cultures around the world
- Reconsidering machine learning's variables to maximize for goals other than commercial profit
If you have any feedback on this episode, please email podcast@eff.org. Please visit the site page at https://eff.org/pod208 where you'll find resources – including links to important legal cases and research discussed in the podcast and a full transcript of the audio. This podcast is supported by the Alfred P. Sloan Foundation's Program in Public Understanding of Science and Technology. Music for How to Fix the Internet was created for us by Reed Mathis and Nat Keefe of BeatMower.
This podcast is licensed Creative Commons Attribution 4.0 International, and includes the following music licensed Creative Commons Attribution 3.0 Unported by their creators:
- Probably Shouldn't by J.Lang: http://dig.ccmixter.org/files/djlang59/59729
- Klaus by Skill_Borrower: http://dig.ccmixter.org/files/Skill_Borrower/41751
- commonGround by airtone: http://dig.ccmixter.org/files/airtone/58703
- Smokey Eyes by Stefan Kartenberg: http://dig.ccmixter.org/files/JeffSpeed68/56377
- Chrome Cactus by Martijn de Boer (NiGiD): http://dig.ccmixter.org/files/NiGiD/62475
S3 E7 · Tue, May 10, 2022
Computer scientists often build algorithms with a keen focus on "solving the problem," without considering the larger implications and potential misuses of the technology they're creating. That's how we wind up with machine learning that prevents qualified job applicants from advancing, or blocks mortgage applicants from buying homes, or creates miscarriages of justice in parole and other aspects of the criminal justice system. James Mickens—a lifelong hacker, perennial wisecracker, and would-be philosopher-king who also happens to be a Harvard University professor of computer science—says we must educate computer scientists to consider the bigger picture early in their creative process. In a world where much of what we do each day involves computers of one sort or another, the process of creating technology must take into account the society it's meant to serve, including the most vulnerable. Mickens speaks with EFF's Cindy Cohn and Danny O'Brien about some of the problems inherent in educating computer scientists, and how fixing those problems might help us fix the internet. In this episode you'll learn about:
- Why it's important to include non-engineering voices, from historians and sociologists to people from marginalized communities, in the engineering process
- The need to balance paying down our "tech debt"—cleaning up the messy, haphazard systems of yesteryear—with innovating new technologies
- How to embed ethics education within computer engineering curricula so students can identify and overcome challenges before they're encoded into new systems
- Fostering transparency about how and by whom your data is used, and for whose profit
- What we can learn from Søren Kierkegaard and Stan Lee about personal responsibility in technology
If you have any feedback on this episode, please email podcast@eff.org.
Please visit the site page at https://eff.org/pod207 where you'll find resources – including links to important legal cases and research discussed in the podcast and a full transcript of the audio. This podcast is supported by the Alfred P. Sloan Foundation's Program in Public Understanding of Science and Technology. Music for How to Fix the Internet was created for us by Reed Mathis and Nat Keefe of BeatMower. This podcast is licensed Creative Commons Attribution 4.0 International, and includes the following music licensed Creative Commons Attribution 3.0 Unported by their creators:
- Probably Shouldn't by J.Lang (c) copyright 2019: http://dig.ccmixter.org/files/djlang59/59729
- commonGround by airtone (c) copyright 2018: http://dig.ccmixter.org/files/airtone/58703
- Xena's Kiss / Medea's Kiss by mwic: http://dig.ccmixter.org/files/mwic/58883
S3 E6 · Tue, May 03, 2022
Too many young people – particularly young people of color – lack enough familiarity or experience with emerging technologies to recognize how artificial intelligence can impact their lives, in either a harmful or an empowering way. Educator Ora Tanner saw this and rededicated her career toward promoting tech literacy and changing how we understand data sharing and surveillance, as well as teaching how AI can be both a dangerous tool and a powerful one for innovation and activism. By now her curricula have touched more than 30,000 students, many of them in her home state of Florida. Tanner also went to bat against the Florida Schools Safety Portal, a project to amass enormous amounts of data about students in an effort to predict and avert school shootings – and a proposal rife with potential biases and abuses. Tanner speaks with EFF's Cindy Cohn and Jason Kelley on teaching young people about the algorithms that surround them, and how they can make themselves heard to build a fairer, brighter tech future. In this episode you'll learn about:
- Convincing policymakers that AI and other potentially invasive tech isn't always the answer to solving public safety problems
- Bringing diverse new voices into the dialogue about how AI is designed and used
- Creating a culture of searching for truth rather than just accepting whatever information is put on your plate
- Empowering disadvantaged communities not only through tech literacy but by teaching informed activism as well
If you have any feedback on this episode, please email podcast@eff.org. Please visit the site page at https://eff.org/pod206 where you'll find resources – including links to important legal cases and research discussed in the podcast and a full transcript of the audio. This podcast is supported by the Alfred P. Sloan Foundation's Program in Public Understanding of Science and Technology. Music for How to Fix the Internet was created for us by Reed Mathis and Nat Keefe of BeatMower.
This podcast is licensed Creative Commons Attribution 4.0 International, and includes the following music licensed Creative Commons Attribution 3.0 Unported by their creators:
- Meet Me at Phountain by gaetanh (c) copyright 2022: http://ccmixter.org/files/gaetanh/64711
- Hoedown at the Roundabout by gaetanh (c) copyright 2022: http://ccmixter.org/files/gaetanh/64711
- JPEG of a Hotdog by gaetanh (c) copyright 2022: http://ccmixter.org/files/gaetanh/64711
- reCreation by airtone (c) copyright 2019: http://dig.ccmixter.org/files/airtone/59721
S3 E5 · Tue, April 12, 2022
The joy of tinkering, making, and sharing is part of the human condition. In modern times, this creative freedom too often is stifled by secrecy as a means of monetization - from non-compete laws to quashing people's right to repair the products they've already paid for. Adam Savage—the maker extraordinaire best known from the television shows MythBusters and Savage Builds—is an outspoken advocate for the right to repair, to tinker, and to put creativity and innovation to work in your own garage. He says a fear-based approach to invention, in which everyone thinks secrecy is the path to a big payday, is exhausting and counterproductive. Savage speaks with EFF's Cindy Cohn and Danny O'Brien about creating a world in which we incrementally keep building on each others' work, keep iterating the old into new, and keep making things better through collaboration. In this episode you'll learn about:
- How cosplay symbolizes what's best about the instincts to make and share
- Why it's better to live in the Star Trek universe than the Star Wars universe
- Balancing the desire for profit with wide dissemination of ideas that benefit society and culture
- Building a movement to encourage more people to be makers - and getting the law out of the way
If you have any feedback on this episode, please email podcast@eff.org. Please visit the site page at https://eff.org/pod205 where you'll find resources – including links to important legal cases and research discussed in the podcast and a full transcript of the audio. This podcast is supported by the Alfred P. Sloan Foundation's Program in Public Understanding of Science and Technology. Music for How to Fix the Internet was created for us by Reed Mathis and Nat Keefe of BeatMower.
This podcast is licensed Creative Commons Attribution 4.0 International, and includes the following music licensed Creative Commons Attribution 3.0 Unported by their creators:
- JPEG of a Hotdog by gaetanh: http://dig.ccmixter.org/files/gaetanh/6471
- Tall Glass of Turnip Juice by gaetanh: http://dig.ccmixter.org/files/gaetanh/6471
- Gone for Smokes by gaetanh: http://dig.ccmixter.org/files/gaetanh/6471
- Declan's Dipsy Doodle by gaetanh: http://dig.ccmixter.org/files/gaetanh/6471
- Whose Hand is That by gaetanh: http://dig.ccmixter.org/files/gaetanh/6471
S3 E4 · Tue, April 05, 2022
Democracy means allowing everyday people to have their voices heard on public matters involving their communities. One of the goals of civic technology is to allow a more diverse group of people to have input on government affairs through the use of technology and the internet. Beth Noveck, author of Solving Public Problems and Director of the Governance Lab, chats with EFF's Cindy Cohn and Danny O'Brien about how civic technology can enhance people's relationship with the government and help improve their communities. In this episode you'll learn about:
- What civic technology is and how it can be used to approach and fix public problems while enhancing the relationship between people and their government
- The importance of deciding what problem you are trying to solve before working on a solution
- Ways that civic technology can ensure that the government is held accountable for its actions
- How we can build civic technology tools to increase inclusion, specifically for those who have been marginalized or previously left out of the conversation
- Why civic technology allows for more people to get engaged in their democracy
- The good and bad that can come with governments increasing their knowledge of technology
If you have any feedback on this episode, please email podcast@eff.org. Please visit the site page at https://eff.org/pod204 where you'll find resources – including links to important legal cases and research discussed in the podcast and a full transcript of the audio. This podcast is supported by the Alfred P. Sloan Foundation's Program in Public Understanding of Science and Technology. Music for How to Fix the Internet was created for us by Reed Mathis and Nat Keefe of BeatMower.
This podcast is licensed Creative Commons Attribution 4.0 International, and includes the following music licensed Creative Commons Attribution 3.0 Unported by their creators:
- Drops of H2O (The Filtered Water Treatment) by J.Lang Ft: Airtone: http://dig.ccmixter.org/files/djlang59/37792
- Xena's Kiss / Medea's Kiss by mwic: http://dig.ccmixter.org/files/mwic/58883
- Kalte Ohren by Alex Ft: starfrosch & Jerry Spoon: http://dig.ccmixter.org/files/AlexBeroza/59612
- Come Inside by Snowflake Ft: Starfrosch, Jerry Spoon, Kara Square, spinningmerkaba: http://dig.ccmixter.org/files/snowflake/59564
- Come Inside by Zep Hurme Ft: snowflake: http://dig.ccmixter.org/files/zep_hurme/59681
S3 E3 · Tue, March 29, 2022
Today almost everything is connected to the internet - from your coffeemaker to your car to your thermostat. But the "Internet of Things" may not be hardwired for security. Window Snyder, computer security expert and author, joins EFF hosts Cindy Cohn and Danny O'Brien as they delve into the scary insecurities lurking in so many of our modern conveniences—and how we can change policies and tech to improve our security and safety. Window Snyder is the founder and CEO of Thistle Technologies. She's the former Chief Security Officer of Square, Fastly, and Mozilla, and she spent five years at Apple focusing on privacy strategy and features for OS X and iOS. Window is also the co-author of Threat Modeling, a manual for security architecture analysis in software. In this episode, Window explains why malicious hackers might be interested in getting access to your refrigerator, doorbell, or printer. These basic household electronics can be an entry point for attackers to gain access to other sensitive devices on your network. Some of these devices may themselves store sensitive data, like a printer or the camera in a kid's bedroom. Unfortunately, many internet-connected devices in your home aren't designed to be easily inspected and reviewed for inappropriate access. That means it can be hard for you to know whether they've been compromised. But the answer is not forswearing all connected devices. Window approaches this problem with some optimism for the future. Software companies have learned, after an onslaught of attacks, to prioritize security. And she covers how we can bring the lessons of software security into the world of hardware devices. In this episode, we explain:
- How it was the hard costs of addressing security vulnerabilities, rather than the sharp stick of regulation, that pushed many tech companies to start prioritizing cybersecurity
- The particular threat of devices that are no longer being updated by the companies that originally deployed them, perhaps because that product is no longer produced, or because the company has folded or been sold
- Why we should adapt our best current systems for software security, like our processes for updating browsers and operating systems, for securing newly networked devices, like doorbells and refrigerators
- Why committing to a year or two of security updates isn't good enough when it comes to consumer goods like cars and medical technology
- Why it's important for hardware creators to build devices so that they will be able to reliably update the software without "bricking" the device
- The challenge of covering the cost of security updates when a user only pays once for the device – and how bundling security updates with new features can entice users to stay updated
S3 E2 · Tue, March 22, 2022
Like many young people, Zach Latta went to a school that didn't teach any computer classes. But that didn't stop him from learning everything he could about them and becoming a programmer at a young age. After moving to San Francisco, Zach founded Hack Club, a nonprofit network of high school coding clubs around the world, to help other students find the education and community that he wished he had as a teenager. This week on our podcast, we talk to Zach about the importance of student access to an open internet, why learning to code can increase equity, and how schools' online security measures and the law often stand in the way. We'll also discuss how computer education can help create the next generation of makers and builders that we need to solve some of society's biggest problems. In this episode, you'll learn about:
- Why schools block some harmless educational content and coding resources, from common sites like Github to "view source" functions on school-issued devices
- How locked down digital systems in schools stop young people from learning about coding and computers, and create equity issues for students who are already marginalized
- How coding and "hack" clubs can empower young people, help them learn self-expression, and find community
- How pervasive school surveillance undermines trust and limits people's ability to exercise their rights when they are older
- How young people's curiosity for how things work online has helped bring us some of the technology we love most
If you have any feedback on this episode, please email podcast@eff.org. Please visit the site page at eff.org/pod202 where you'll find resources – including links to important legal cases and research discussed in the podcast and a full transcript of the audio. This podcast is supported by the Alfred P. Sloan Foundation's Program in Public Understanding of Science and Technology. Music for How to Fix the Internet was created for us by Reed Mathis and Nat Keefe of BeatMower.
This podcast is licensed Creative Commons Attribution 4.0 International, and includes the following music licensed Creative Commons Attribution 3.0 Unported by their creators:
- Warm Vacuum Tube by Admiral Bob (c) copyright 2019, Ft: starfrosch: http://dig.ccmixter.org/files/admiralbob77/59533
- Drops of H2O (The Filtered Water Treatment) by J.Lang (c) copyright 2012, Ft: Airtone: http://dig.ccmixter.org/files/djlang59/37792
- reCreation by airtone (c) copyright 2019: http://dig.ccmixter.org/files/airtone/59721
S3 E1 · Tue, March 15, 2022
Imagine being detained by armed agents whenever you returned from traveling outside the country. That's what life became like for Academy Award-winning filmmaker Laura Poitras, who was placed on a terrorist watch-list after she made a documentary critical of the U.S. invasion and occupation of Iraq. Poitras was detained close to 100 times between 2006 and 2012, and border agents routinely copied her notebooks and threatened to take her electronics. It was only after Poitras teamed up with EFF to sue the government that she was able to see evidence of the government's six-year campaign of spying on her. This week on our podcast, Poitras joins EFF's Cindy Cohn and Danny O'Brien to talk about her continuing work to uncover spying on journalists, and what we can do to fight back against mass surveillance. In this episode you'll learn about:
- What life was like for Poitras when she was placed on a terror watch list and put under FBI surveillance
- Why security is a "team sport," and what we can all do to protect ourselves as well as more vulnerable people
- Poitras' new work about the NSO Group, an Israeli spyware company that has been accused of facilitating human rights abuses worldwide
- What legal strategies can be used to push back on mass surveillance
- The role of whistleblowers like Edward Snowden and human rights activists in uncovering spying abuses, and how they can be better protected
- The laws that we need to protect professional journalists and citizen journalists in an age where anyone can record the news
If you have any feedback on this episode, please email podcast@eff.org. Please visit the site page at https://eff.org/pod201 where you'll find resources – including links to important legal cases and research discussed in the podcast and a full transcript of the audio. This podcast is supported by the Alfred P. Sloan Foundation's Program in Public Understanding of Science and Technology.
Music for How to Fix the Internet was created for us by Reed Mathis and Nat Keefe of BeatMower. This podcast is licensed Creative Commons Attribution 4.0 International, and includes the following music licensed Creative Commons Attribution 3.0 Unported by their creators:
- Come Inside by Snowflake (c) copyright 2019, Ft: Starfrosch, Jerry Spoon, Kara Square, spinningmerkaba: http://dig.ccmixter.org/files/snowflake/59564
- Kalte Ohren by Alex (c) copyright 2019: http://dig.ccmixter.org/files/AlexBeroza/59612
Bonus · Tue, March 08, 2022
Our guest from Season 2, Ethan Zuckerman, has his own podcast: Reimagining the Internet. He had EFF's Jillian York as a guest on his show, and we thought you'd like to have a listen to it.
S2 E10 · Tue, February 01, 2022
Marc Maron is the host of a successful podcast. When he and other podcasting pioneers started out, they didn't have to think much about the layers of technology they were using, until a patent troll came calling, demanding thousands of dollars for the "rights" to podcasting based on a patent it was misusing to extract money from the nascent podcast world. Marc and his producer Brendan knew that if they didn't take up the fight to stop the trolls, all of podcasting would be under threat, so they joined up with some EFF lawyers and a whole lot of listeners to win their fight. In this episode you'll learn about:
- The prior art, or evidence of earlier technology, that EFF was able to present to courts to prove that the so-called "podcasting patent" was invalid
- How the landmark Alice v. CLS Bank Supreme Court decision has helped make patent law better, but still didn't solve the problem of patent trolls
- Why patent trolls are a drain on innovation
- How we should think about which ideas should be building blocks for the public good, and which should be owned
- Why the community that came together around the podcasting patent fight was critical to EFF's victory
- How EFF prevailed when the patent troll tried to get the names of EFF donors
If you have any feedback on this episode, please email podcast@eff.org. Please visit the site page at https://eff.org/pod110 where you'll find resources – including links to important legal cases and research discussed in the podcast and a full transcript of the audio. This podcast is supported by the Alfred P. Sloan Foundation's Program in Public Understanding of Science and Technology. This podcast is licensed Creative Commons Attribution 4.0 International, and includes the following music licensed Creative Commons Attribution 3.0 Unported by their creators:
- Warm Vacuum Tube by Admiral Bob (c) copyright 2019, featuring starfrosch: http://dig.ccmixter.org/files/admiralbob77/59533
- Drops of H2O (The Filtered Water Treatment) by J.Lang (c) copyright 2012, Ft: Airtone: http://dig.ccmixter.org/files/djlang59/37792
- Xena's Kiss / Medea's Kiss by mwic (c) copyright 2018: http://dig.ccmixter.org/files/mwic/58883
- reCreation by airtone (c) copyright 2019: http://dig.ccmixter.org/files/airtone/59721
S2 E9 · Tue, January 25, 2022
What if we re-imagined the internet to be built by more people, in new ways, that actually worked for us as a public good instead of a public harm? Join Ethan Zuckerman in conversation with Cindy Cohn and Danny O'Brien as they fix and reimagine the internet. They'll talk about what the internet could look like if a diversity of people built their own tools, how advertising could be less creepy, but still work, and how hope in the future will light the way to a better internet. In this episode you'll learn about:
- The challenges researchers face when gathering information and data about our relationship with social media platforms
- Different ways to communicate with groups online and how these alternatives would improve online speech
- Ways that third parties have tried to give more user control in social media platforms
- How censorship, and who we worry about censoring speech, has changed as the internet has evolved
- The problems with surveillance advertising and alternative ideas for advertisements on the internet
- How the Computer Fraud and Abuse Act blocks research and innovation, and how we can fix it
- How communication on the internet has changed over time, why social media giants aren't getting it right, and how to move forward
If you have any feedback on this episode, please email podcast@eff.org. Please visit the site page at https://eff.org/pod109 where you'll find resources – including links to important legal cases and research discussed in the podcast and a full transcript of the audio. This podcast is supported by the Alfred P. Sloan Foundation's Program in Public Understanding of Science and Technology. This podcast is licensed Creative Commons Attribution 4.0 International, and includes the following music licensed Creative Commons Attribution 3.0 Unported by their creators:
- Come Inside by Zep Hurme (c) copyright 2019, Ft: snowflake: http://dig.ccmixter.org/files/zep_hurme/59681
- Perspectives by J.Lang (c) copyright 2019, Ft: Sackjo22 and Admiral Bob: http://dig.ccmixter.org/files/djlang59/60335
- Xena's Kiss / Medea's Kiss by mwic (c) copyright 2018: http://dig.ccmixter.org/files/mwic/58883
- Drops of H2O (The Filtered Water Treatment) by J.Lang (c) copyright 2012, Ft: Airtone: http://dig.ccmixter.org/files/djlang59/37792
- reCreation by airtone (c) copyright 2019: http://dig.ccmixter.org/files/airtone/59721
S2 E8 · Tue, January 18, 2022
Financial transactions reveal so much about us: the causes we support, where we go, what we buy, who we spend time with. Somehow, the mass surveillance of financial transactions has been normalized in the United States, despite the Fourth Amendment's protections in the Constitution. But it doesn't have to be that way, as explained by Marta Belcher, a lawyer and activist in the financial privacy world. Marta offers a deep dive into financial surveillance and censorship. In this episode, you'll learn about:
- The concept of the third party doctrine, a court-created idea that law enforcement doesn't need to get a warrant to access metadata shared with third parties (such as companies that manage communications and banking services)
- How financial surveillance can have a chilling effect on activist communities, including pro-democracy activists fighting against authoritarian regimes in Hong Kong and elsewhere
- How the Bank Secrecy Act means that your bank services are sharing sensitive banking details on customers with the government by default, without any request from law enforcement to prompt it
- Why the Bank Secrecy Act as it's currently interpreted violates the Fourth Amendment
- The potential role of blockchain technologies to import some of the privacy-protective features of cash into the digital world
- How one recent case missed an opportunity to better protect the data of cryptocurrency users
- How financial surveillance is a precursor to financial censorship, in which banking services are restricted for people who haven't violated the law
If you have any feedback on this episode, please email podcast@eff.org. Please visit the site page at https://eff.org/pod108 where you'll find resources – including links to important legal cases and research discussed in the podcast and a full transcript of the audio. This podcast is supported by the Alfred P. Sloan Foundation's Program in Public Understanding of Science and Technology.
This podcast is licensed Creative Commons Attribution 4.0 International, and includes the following music licensed Creative Commons Attribution 3.0 Unported by their creators:
- Come Inside by Zep Hurme (c) copyright 2019, Ft: snowflake: http://dig.ccmixter.org/files/zep_hurme/59681
- Perspectives by J.Lang (c) copyright 2019, Ft: Sackjo22 and Admiral Bob: http://dig.ccmixter.org/files/djlang59/60335
- Kalte Ohren by Alex (c) copyright 2019, Ft: starfrosch & Jerry Spoon: http://dig.ccmixter.org/files/AlexBeroza/59612
S2 E7 · Tue, January 11, 2022
One of the supposed promises of AI was that it would be able to take the bias out of human decisions, and maybe even lead to more equity in society. But the reality is that the errors of the past are embedded in the data of today, keeping prejudice and discrimination in. Pair that with surveillance capitalism, and what you get are algorithms that impact the way consumers are treated, from how much they pay for things, to what kinds of ads they are shown, to whether a bank will even lend them money. But it doesn’t have to be that way, because the same techniques that prey on people can lift them up. Vinhcent Le from the Greenlining Institute joins Cindy and Danny to talk about how AI can be used to make things easier for people who need a break. In this episode you’ll learn about: Redlining—the pernicious system that denies historically marginalized people access to loans and financial services—and how modern civil rights laws have attempted to ban this practice. How the vast amount of our data collected through modern technology, especially browsing the Web, is often used to target consumers for products, and in effect recreates the illegal practice of redlining. The weaknesses of the consent-based models for safeguarding consumer privacy, which often mean that people are unknowingly waiving away their privacy whenever they agree to a website’s terms of service. How the United States currently has an insufficient patchwork of state laws that guard different types of data, and how a federal privacy law is needed to set a floor for basic privacy protections. How we might reimagine machine learning as a tool that actively helps us root out and combat bias in consumer-facing financial services and pricing, rather than exacerbating those problems. The importance of transparency in the algorithms that make decisions about our lives. How we might create technology to help consumers better understand the government services available to them. 
If you have any feedback on this episode, please email podcast@eff.org. Please visit the site page at https://eff.org/pod107 where you’ll find resources – including links to important legal cases and research discussed in the podcast and a full transcript of the audio. This podcast is supported by the Alfred P. Sloan Foundation's Program in Public Understanding of Science and Technology. This podcast is licensed Creative Commons Attribution 4.0 International, and includes the following music licensed Creative Commons Attribution 3.0 Unported by their creators Drops of H2O ( The Filtered Water Treatment ) by J.Lang (c) copyright 2012 Licensed under a Creative Commons Attribution (3.0) Unported license. http://dig.ccmixter.org/files/djlang59/37792 Ft: Airtone Come Inside by Zep Hurme (c) copyright 2019 Licensed under a Creat
S2 E6 · Tue, December 21, 2021
Matt Mitchell started Crypto Harlem to teach people in his community about how online and real-life surveillance works, and what they could do about it. Through empowering people to understand their online privacy choices, and to speak up for change when their privacy in real life is eroded, Matt is building a movement to make a better future for everyone. In this episode you’ll learn about: Cryptoparties being organized by volunteers to educate people about what surveillance technology looks like, how it works, and who installed it; How working within your own community can be an extremely effective (and fun) way to push back against surveillance; How historically surveilled communities have borne the brunt of new, digital forms of surveillance; The ineffectiveness and bias of much new surveillance technology, and why it’s so hard to “surveil yourself to safety”; Why and how heavily surveilled communities are taking back their privacy, sometimes using new technology; The ways that Community Control Of Police Surveillance (CCOPS) legislation can benefit communities by offering avenues to learn about and discuss surveillance technology before it’s installed; How security and digital privacy have improved, with new options, settings, and applications that offer more control over our online lives. If you have any feedback on this episode, please email podcast@eff.org. Please visit the site page at https://eff.org/pod106 where you’ll find resources – including links to important legal cases and research discussed in the podcast and a full transcript of the audio. This podcast is supported by the Alfred P. Sloan Foundation's Program in Public Understanding of Science and Technology. This podcast is licensed Creative Commons Attribution 4.0 International, and includes the following music licensed Creative Commons Attribution 3.0 Unported by their creators: Warm Vacuum Tube by Admiral Bob (c) copyright 2019 Licensed under a Creative Commons Attribution (3.0) Unported license. 
http://dig.ccmixter.org/files/admiralbob77/59533 Ft: starfrosch Xena's Kiss / Medea's Kiss by mwic (c) copyright 2018 Licensed under a Creative Commons Attribution (3.0) Unported license. http://dig.ccmixter.org/files/mwic/58883 Warm Vacuum Tube by Admiral Bob (c) copyright 2019 Licensed under a Creative Commons Attribution (3.0) Unported license. http://dig.ccmixter.org/files/admiralbob77/59533 Ft: starfrosch reCreation by airtone (c) copyright 2019 Licensed under a Creative Commons Attribution (3.0) Unported license. http://dig.ccmixter.org/files/airtone/59721
S2 E5 · Tue, December 14, 2021
We don’t always think about what it means to have the information on our devices stay secure, and it may seem like the locks on our phones are enough to keep our private lives private. But there is increasing pressure from law enforcement to leave a back door open on our encrypted devices. Meanwhile, other government agencies, including consumer protection agencies, want more secure devices. We dive into the nuances of the battle to secure our data and our lives, and consider what the future would be like if we can finally end the “crypto wars” and tackle other problems in society. On this episode, hosts Cindy Cohn and Danny O’Brien are joined by Riana Pfefferkorn from Stanford’s Center for Internet and Society to talk about device encryption and why it’s important. If you have any feedback on this episode, please email podcast@eff.org. Please visit the site page at https://eff.org/pod105 where you’ll find resources – including links to important legal cases and research discussed in the podcast and a full transcript of the audio. In this episode you’ll learn about: The different types of data law enforcement tries to gather, including data “at rest” and “in transit”; The divide between the law enforcement, national security, and intelligence communities regarding their stance on strong encryption and backdoors on devices; How the First Amendment plays a role in cryptography and in law enforcement’s attempts to force companies to build certain code into their software; How strong encryption and device security empower users to voice their thoughts freely. Riana Pfefferkorn is a Research Scholar at the Stanford Internet Observatory. She focuses on investigating and analyzing the U.S. and other governments’ policies and practices for forcing decryption and/or influencing crypto-related design of online platforms and services via technical means and through courts and legislatures. 
Riana also researches the benefits and detriments of strong encryption on free expression, political engagement, and more. You can find Riana Pfefferkorn on Twitter @Riana_Crypto. You can find a copy of this episode on the Internet Archive. This podcast is supported by the Alfred P. Sloan Foundation's Program in Public Understanding of Science and Technology. This podcast is licensed Creative Commons Attribution 4.0 International, and includes the following music licensed Creative Commons Attribution 3.0 Unported by their creators: Kalte Ohren by Alex (c) copyrig
S2 E4 · Tue, December 07, 2021
There are flaws in the tech we use every day – from little software glitches to big data breaches – and security researchers often know about them before we do. Getting those issues fixed is not always as straightforward as it should be. It’s not always easy to bend a corporation's ear, and companies may ignore the threat for liability reasons, putting us all at risk. Technology and cybersecurity expert Tarah Wheeler joins Cindy Cohn and Danny O’Brien to explain how she thinks security experts can help build a more secure internet. If you have any feedback on this episode, please email podcast@eff.org. Please visit the site page at https://eff.org/pod104 where you’ll find resources – including links to important legal cases and research discussed in the podcast and a full transcript of the audio. On this episode, you’ll learn: About the human impact of security vulnerabilities—and how unpatched flaws can change or even end lives; How to reconsider the popular conception of hackers, and understand their role in helping build a more secure digital world; How the Computer Fraud and Abuse Act (CFAA), a law that is supposed to punish computer intrusion, has been written so broadly that it now stifles security researchers; What we can learn from the culture around airplane safety regulation—including transparency and blameless post-mortems; How we can align incentives, including financial incentives, to improve vulnerability reporting and response; How the Supreme Court case Van Buren helped security researchers by ensuring that the CFAA couldn’t be used to prosecute someone for merely violating the terms of service of a website or application; How a better future would involve more collaboration and transparency among both companies and security researchers. This podcast is supported by the Alfred P. Sloan Foundation's Program in Public Understanding of Science and Technology. 
Resources: Consumer Data Privacy: Equifax Data Breach Update: Backsliding (EFF) EFF’s Recommendations for Consumer Data Privacy Laws (EFF) Strengthen California’s Next Consumer Data Privacy Initiative (EFF) Ransomware: A Hospital Hit by Hackers, a Baby in Distress: The Case of the First Alleged Ransomware Death (WSJ) FAQ: DarkSide Ransomware Group and Colonial Pipel
S2 E3 · Tue, November 30, 2021
The bots that try to moderate speech online are doing a terrible job, and the humans in charge of the biggest tech companies aren’t doing any better. The internet’s promise was to be a space where everyone could have their say. But today, just a few platforms get to decide what billions of people see and say online. What’s a better way forward? How can we get back to a world where communities and people decide what’s best for content moderation, rather than tech billionaires or government dictates? Join Daphne Keller, from Stanford’s Center for Internet and Society, in conversation with EFF’s Cindy Cohn and Danny O’Brien about a better way to moderate speech online. If you have any feedback on this episode, please email podcast@eff.org. Please visit the site page at https://eff.org/pod103 where you’ll find resources – including links to important legal cases and research discussed in the podcast and a full transcript of the audio. In this episode you’ll learn about: — Why giant platforms do a poor job of moderating content — What competitive compatibility (ComCom) is, and how it’s a vital part of the solution to our content moderation puzzle — Why machine learning algorithms won’t be able to figure out who or what a “terrorist” is, and who they’re likely to catch instead — What the debate over “amplification” of speech is, and whether it’s any different from our debate over speech itself — Why international voices need to be included in discussions about content moderation, and the problems that occur when they’re not — How we could shift toward “bottom-up” content moderation rather than a concentration of power. This podcast is supported by the Alfred P. Sloan Foundation's Program in Public Understanding of Science and Technology. 
This podcast is licensed Creative Commons Attribution 4.0 International, and includes the following music licensed Creative Commons Attribution 3.0 Unported by their creators : Come Inside by Zep Hurme (c) copyright 2019 Licensed under a Creative Commons Attribution (3.0) Unported license. http://dig.ccmixter.org/files/zep_hurme/59681 Ft: snowflake Perspectives *** by J.Lang (c) copyright 2019 Licensed under a Creative Commons Attribution (3.0) Unported license. http://dig.ccmixter.org/files/djlang59/60335 Ft: Sackjo22 and Admiral Bob Kalte Ohren by Alex (c) copyright 2019 Licensed under a Creative Commons Attribution (3.0) Unported license. http://dig.ccmixter.org/files/AlexBeroza/59612 Ft: starfrosch & Jerry Spoon Warm Vacuum Tube by Admiral Bob (c) copyright 2019 Licensed under a Creative Commons Attribution (3.0) Unported license. http://dig.ccmixter.
S2 E2 · Tue, November 23, 2021
Open source software touches every piece of technology that touches our lives – in other words, it’s everywhere. Free software and collaboration are at the heart of every device we rely on, and much of the internet is built from the hard work of people dedicated to the open source dream: the ideal that all software should be licensed so that it can be freely used, modified, distributed, and copied without penalty. The movement is growing, and that growth is creating pressure: too many projects, and not enough resources. The culture is shifting, too, as new people around the world join in and bring different ideas and different dreams for an open source future. James Vasile has been working in open source software for decades, and he joins Cindy Cohn and Danny O’Brien to talk about the challenges that growth is creating, and the opportunities it presents to make open source, and the Internet, even better. If you have any feedback on this episode, please email podcast@eff.org. Please visit the site page at https://eff.org/pod102 where you’ll find resources – including links to important legal cases and research discussed in the podcast and a full transcript of the audio. This podcast is supported by the Alfred P. Sloan Foundation's Program in Public Understanding of Science and Technology. This work is licensed under a Creative Commons Attribution 4.0 International License. Additional music used under Creative Commons license from CCMixter includes: Warm Vacuum Tube by Admiral Bob (c) copyright 2019 Licensed under a Creative Commons Attribution 3.0 Unported license. http://dig.ccmixter.org/files/admiralbob77/59533 Ft: starfrosch Come Inside by Snowflake (c) copyright 2019 Licensed under a Creative Commons Attribution (3.0) Unported license. http://dig.ccmixter.org/files/snowflake/59564 Ft: Starfrosch, Jerry Spoon, Kara Square, spinningmerkaba Xena's Kiss / Medea's Kiss by mwic (c) copyright 2018 Licensed under a Creative Commons Attribution (3.0) Unported license. 
http://dig.ccmixter.org/files/mwic/58883 Drops of H2O ( The Filtered Water Treatment ) by J.Lang (c) copyright 2012 Licensed under a Creative Commons Attribution (3.0) Unported license. http://dig.ccmixter.org/files/djlang59/37792 Ft: Airtone reCreation by airtone (c) copyright 2019 Licensed under a Creative Commons Attribution (3.0) Unported license. http://dig.ccmixter.org/files/airtone/59721
S2 E1 · Tue, November 16, 2021
Your phone is a window to your soul – and that window has been left open to law enforcement. Today, even small-town police departments have powerful tools that can easily access the most intimate information on your cell phone. Upturn’s Executive Director Harlan Yu joins EFF hosts Cindy Cohn and Danny O’Brien to talk about a better way for law enforcement to treat our data. When Upturn researchers surveyed police departments on the mobile device forensic tools they were using on mobile phones, they discovered that the tools are being used by police departments large and small across America. There are few rules on what law enforcement can do with the data they download, and not very many policies on how the information should be stored, shared, or destroyed. In this episode you’ll learn about: Mobile device forensic tools (MDFTs) that are used by police to download data from your phone, even when it’s locked; How court cases such as Riley v. California powerfully protect our digital privacy – but those protections are evaded when police get verbal consent to search a phone; How widespread the use of MDFTs is by law enforcement departments across the country, including small-town police departments investigating minor infractions; The roles that phone manufacturers and mobile device forensic tool vendors can play in protecting user data; How re-envisioning our approaches to phone surveillance helps address issues of systemic targeting of marginalized communities by police agencies; The role of warrants in protecting our digital data. If you have any feedback on this episode, please email podcast@eff.org. Please visit the site page at https://eff.org/pod101 where you’ll find resources – including links to important legal cases and research discussed in the podcast and a full transcript of the audio. This podcast is supported by the Alfred P. Sloan Foundation's Program in Public Understanding of Science and Technology. 
This work is licensed under a Creative Commons Attribution 4.0 International License. Additional music used under Creative Commons license from CCMixter includes: Warm Vacuum Tube by Admiral Bob (c) copyright 2019 Licensed under a Creative Commons Attribution 3.0 Unported license. http://dig.ccmixter.org/files/admiralbob77/59533 Ft: starfrosch Come Inside by Snowflake (c) copyright 2019 Licensed under a Creative Commons Attribution (3.0) Unported license. http://dig.ccmixter.org/files/snowflake/59564 Ft: Starfrosch, Jerry Spoon, Kara Square, spinningmerkaba Xena's Kiss / Medea's Kiss by mwic (c) copyright 2018 Licensed under a Creative Commons Attribution (3.0) Unported license. http://dig.ccmixter.org/files/mwic/58883
Trailer · Tue, November 09, 2021
How to Fix the Internet from the Electronic Frontier Foundation brings you ideas, solutions, and pathways to a better digital future for all.
S1 E6 · Tue, December 08, 2020
Chris Lewis joins EFF hosts Cindy Cohn and Danny O’Brien as they discuss how our access to knowledge is increasingly governed by "click-wrap" agreements that prevent users from ever owning things like books and music, and how this undermines the legal doctrine of “first sale” – which states that once you buy a copyrighted work, it’s yours to resell or give away as you choose. They talk through the ramifications of this shift on society, and also start to paint a brighter future for how the digital world would thrive if we safeguard digital first sale. In this episode you’ll learn about: The legal doctrine of first sale—in which owners of a copyrighted work can resell it or give it away as they choose—and why copyright maximalists have fought it for so long; The ReDigi case, in which a federal court held that the ReDigi music service, which allows music fans to store and resell music they buy from iTunes, violated copyright law—and why that set us down the wrong path; The need for a movement that can help champion digital first sale and access to knowledge more generally; How digital first sale connects to issues of access to knowledge, and how this provides a nexus to issues of societal equity; Why the shift to using terms of service to govern access to content such as music and books has meant that our access to knowledge is intermediated by contract law, which is often impenetrable to average users; How not having a strong right of digital first sale undermines libraries, which have long benefited from bequests and donations; How getting first sale right in the digital world will help to promote equitable access to knowledge and create a more accessible digital world. Christopher Lewis is President and CEO at Public Knowledge. Prior to being elevated to President and CEO, Chris served as PK's Vice President from 2012 to 2019, where he led the organization's day-to-day advocacy and political strategy on Capitol Hill and at government agencies. 
During that time he also served as a local elected official, serving two terms on the Alexandria City Public School Board. Chris serves on the Board of Directors for the Institute for Local Self Reliance and represents Public Knowledge on the Board of the Broadband Internet Technical Advisory Group (BITAG). Before joining Public Knowledge, Chris worked in the Federal Communications Commission Office of Legislative Affairs, including as its Deputy Director. He is a former U.S. Senate staffer for the late Sen. Edward M. Kennedy and has over 18 years of political organizing and advocacy experience, including serving as Virginia State Director at GenerationEngage and working as the North Carolina Field Director for Barack Obama's 2008 Presidential Campaign, among other roles throughout the campaign. Chris graduated from Harvard University with a Bachelor's degree in Government and lives in Alexandria, VA where he con
S1 E5 · Tue, December 01, 2020
Abi Hassen joins EFF hosts Cindy Cohn and Danny O’Brien as they discuss the rise of facial recognition technology, how this increasingly powerful identification tool is ending up in the hands of law enforcement, and what that means for the future of public protest and the right to assemble and associate in public places. In this episode you’ll learn about: The Black Movement Law Project, which Abi co-founded, and how it has evolved over time to meet the needs of protesters; Why the presumption that people don’t have any right to privacy in public spaces is challenged by increasingly powerful identification technologies; Why we may need to think big when it comes to updating U.S. law to protect privacy; How face recognition technology can have a chilling effect on public participation, even when the technology isn’t accurate; How face recognition technology is already leading to the wrongful arrest of innocent people, as seen in a recent case of a man in Detroit; How gang laws and anti-terrorism laws have been the foundation of legal tools that can now be deployed against political activists; Understanding face recognition technology within the context of a range of powerful surveillance tools in the hands of law enforcement; How we can start to fix the problems caused by facial recognition through increased transparency, community control, and hard limits on law enforcement use of face recognition technology; How Abi sees the further goal as moving beyond restricting or regulating specific technologies to a world where public protests are not so necessary, as part of reimagining the role of law enforcement. Abi is a political philosophy student, attorney, technologist, and co-founder of the Black Movement Law Project, a legal support rapid response group that grew out of the uprisings in Ferguson, Baltimore, and elsewhere. He is also a partner (currently on leave) at O’Neill and Hassen LLP, a law practice focused on indigent criminal defense. 
Prior to his current positions, he was the Mass Defense Coordinator at the National Lawyers Guild. Abi has also worked as a political campaign manager and strategist, union organizer, and community organizer. He conducts trainings, speaks, and writes on topics of race, technology, (in)justice, and the law. Abi is particularly interested in exploring the dynamic nature of institutions, political movements, and their interactions from the perspective of complex systems theory. You can find Abi on Twitter at @AbiHassen, and his website is https://AbiHassen.com. Please subscribe to How to Fix the Internet via RSS, Stitcher, or your podcast player of choice.
S1 E4 · Tue, November 24, 2020
Cory Doctorow joins EFF hosts Cindy Cohn and Danny O’Brien as they discuss how large, established tech companies like Apple, Google, and Facebook can block interoperability in order to squelch competition and control their users, and how we can fix this by taking away big companies' legal right to block new tools that connect to their platforms – tools that would let users control their digital lives. In this episode you’ll learn about: How the power to leave a platform is one of the most fundamental checks users have on abusive practices by tech companies—and how tech companies have made it harder for their users to leave their services while still participating in our increasingly digital society; How the lack of interoperability in modern tech platforms is often a set of technical choices that are backed by a legal infrastructure for enforcement, including the Digital Millennium Copyright Act (DMCA) and the Computer Fraud and Abuse Act (CFAA). This means that attempting to overcome interoperability barriers can come with legal risks as well as financial risks, making it especially unlikely for new entrants to attempt interoperating with existing technology; How online platforms block interoperability in order to silence their critics, which can have real free speech implications; The “kill zone” that exists around existing tech products, where investors will not back tech startups challenging existing tech monopolies, and even startups that can get a foothold may find themselves bought out by companies like Facebook and Google; How we can fix it: The role of “competitive compatibility,” also known as “adversarial interoperability,” in reviving stagnant tech marketplaces; How we can fix it by amending or interpreting the DMCA, CFAA and contract law to support interoperability rather than threaten it; How we can fix it by supporting the role of free and open source communities as champions of interoperability, offering alternatives to the existing tech giants. 
Cory Doctorow (craphound.com) is a science fiction author, activist and journalist. He is the author of many books, most recently ATTACK SURFACE, RADICALIZED and WALKAWAY, science fiction for adults; IN REAL LIFE, a graphic novel; INFORMATION DOESN’T WANT TO BE FREE, a book about earning a living in the Internet age; and HOMELAND, a YA sequel to LITTLE BROTHER. His latest book is POESY THE MONSTER SLAYER, a picture
S1 E3 · Tue, November 17, 2020
Jumana Musa joins EFF hosts Cindy Cohn and Danny O’Brien as they discuss how the third-party doctrine is undermining our Fourth Amendment right to privacy when we use digital services, and how recent court victories are a hopeful sign that we may reclaim these privacy rights in the future. In this episode you’ll learn about: How the third-party doctrine is a judge-created legal doctrine that impacts your business records held by companies, including metadata such as what websites you visit, who you talk to, your location information, and much more; The Jones case, a vital Supreme Court case that found that law enforcement can’t use continuous location tracking with a GPS device without a warrant; The Carpenter case, which found that the police must get a warrant before accessing cell site location information from a cell phone company over time; How law enforcement uses geofence warrants to scoop up the location data collected by companies from every device that happens to be in a geographic area during a specific period of time in the past; How getting the Fourth Amendment right is especially important because it is part of combatting racism: communities of color are more frequently surveilled and targeted by law enforcement, and thus slipshod legal standards for accessing data have a disproportionate impact on communities of color; Why even a warrant may not be an adequate legal standard sometimes, and why there are circumstances in which accessing business records should require a “super warrant” – meaning law enforcement could only access the data when investigating a limited number of crimes, and only if the data would be important to the investigation. Jumana Musa is a human rights attorney and racial justice activist. She is currently the Director of the Fourth Amendment Center at the National Association of Criminal Defense Lawyers. As director, Ms. Musa oversees NACDL's initiative to build a new, more durable Fourth Amendment legal doctrine for the digital age. 
The Fourth Amendment Center educates the defense bar on privacy challenges in the digital age, provides a dynamic toolkit of resources to help lawyers identify opportunities to challenge government surveillance, and establishes a tactical litigation support network to assist in key cases. Ms. Musa previously served as NACDL's Sr. Privacy and National Security Counsel. Prior to joining NACDL, Ms. Musa served as a policy consultant for the Southern Border Communities Coalition, a coalition of over 60 groups across the southwest that address militarization and brutality by U.S. Customs and Border Protection agents in border communities. Previously, she served as Deputy Director for the Rights Working Group, a national coalition of civil rights, civil liberties, human rights, and immigrant rights advocates where she coordinated the “Face the Truth” campaign against racial profiling. She was also the Advocacy Directo
S1 E2 · Fri, November 06, 2020
Gigi Sohn joins EFF hosts Cindy Cohn and Danny O’Brien as they discuss broadband access in the United States – or the lack thereof. Gigi explains the choices American policymakers and tech companies made to create a country where millions of Americans lack access to reliable broadband, and what steps we need to take to fix the problem now. In this episode you’ll learn: How the FCC defines broadband Internet, and why that definition makes no sense in 2020; How many other countries adopted policies that either incentivized competition among Internet providers or invested in government infrastructure for Internet services, while the United States did neither, leaving much of the country with only one or two Internet service providers, high costs, and poor-quality Internet service; Why companies like AT&T and Verizon aren’t investing in fiber; How the FCC uses a law about telephone regulation to assert authority over regulating broadband access, and how the 1996 Telecommunications Act granted the FCC permission to forbear – or not apply – certain parts of that law; How 19 states in the U.S. have bans or limitations on municipal broadband, and why repealing those bans is key to increasing broadband access; How Internet access is connected to issues of equity, upward mobility, and job accessibility, as well as related issues of racial justice, citizen journalism, and police accountability; Specific suggestions and reforms, including emergency subsidies and a major investment in infrastructure, that could help turn this situation around. Gigi is a Distinguished Fellow at the Georgetown Law Institute for Technology Law & Policy and a Benton Senior Fellow and Public Advocate. She is one of the nation’s leading public advocates for open, affordable and democratic communications networks. From 2013-2016, Gigi was Counselor to the former Chairman of the Federal Communications Commission, Tom Wheeler. 
She advised the Chairman on a wide range of Internet, telecommunications and media issues, representing him and the FCC in a variety of public forums around the country as well as serving as the primary liaison between the Chairman’s office and outside stakeholders. From 2001-2013, Gigi served as the Co-Founder and CEO of Public Knowledge, a leading telecommunications, media and technology policy advocacy organization. She was previously a Project Specialist in the Ford Foundation’s Media, Arts and Culture unit and Executive Director of the Media Access Project, a public interest law firm. You can find Gigi on her own podcast, T
S1 E1 · Fri, November 06, 2020
In the inaugural episode of EFF's "How to Fix the Internet" podcast, the Cato Institute’s specialist in surveillance legal policy, Julian Sanchez, joins EFF hosts Cindy Cohn and Danny O’Brien as they delve into the problems with the Foreign Intelligence Surveillance Court, also known as the FISC or the FISA Court. Sanchez explains how the FISA Court signs off on surveillance of huge swaths of our digital lives, and how the format and structure of the FISA Court is inherently flawed. In this episode, you’ll learn about: How the FISA Court impacts your digital privacy; The makeup of the FISA Court and how judges are chosen; How almost all of the key decisions about the legality of America's mass Internet spying projects have been made by the FISC; How the current system promotes ideological hegemony within the FISA Court; How the FISC’s endless-secrecy-by-default system insulates it from the ecosystem of jurisprudence that could act as a guardrail against poor decisions as well as accountability for them; How the FISC’s remit has ballooned from approving individual surveillance orders to signing off on broad programmatic types of surveillance; Why we need a stronger amicus role in the FISC, and especially a bigger role for technical experts to advise the court; Specific reforms that could be enacted to address these systemic issues and ensure a more fair review of surveillance systems. Julian is a senior fellow at the Cato Institute and studies issues at the intersection of technology, privacy, and civil liberties, with a particular focus on national security and intelligence surveillance. Before joining Cato, Julian served as the Washington editor for the technology news site Ars Technica, where he covered surveillance, intellectual property, and telecom policy. He has also worked as a writer for The Economist’s blog Democracy in America and as an editor for Reason magazine, where he remains a contributing editor. 
Sanchez has written on privacy and technology for a wide array of national publications, ranging from the National Review to The Nation, and is a founding editor of the policy blog Just Security. He studied philosophy and political science at New York University. Find him on Twitter at @Normative. A transcript of the episode, as well as legal resources – including links to important cases, books, and briefs discussed in the podcast – is available at https://eff.org/deeplinks/2020/11/secret-court-approving-secret-surveillance. Please subscribe to How to Fix the Internet using your podcast player of choice. If you have any feedback on this episode, please email podcast@eff.org.