Hadooponomics: Big Government, Big Tech … Big Trouble? How Tech and Global Policy Can Co-Exist (Podcast Transcript)

Listen to the original podcast.

James Haight: You’re listening to the Hadooponomics podcast, and this is your host, James Haight. Pleasure to have you back here with us today. Our episode today is about the intersection of privacy, data, and policy. And for any of those out there who hear those words and sort of want to run away, don’t worry. We actually keep this fairly high level. It’s not too technical. What we’re focusing on is what’s happening in that world, and how it’s actually going to impact our lives, whether we work in the data industry, or we’re enthusiasts, or just sort of in our general lives as well. And to accomplish this, we’ve invited on, as our guest, Evan Swarztrauber, a really interesting guy who not only works for TechFreedom, this think tank in Washington, DC, for tech advocacy, but he hosts the Tech Policy Podcast. It’s a real treat to have him on, because, basically, his job is to talk to the most interesting, innovative, and informed minds in the world of tech policy, and to understand where the future is going. So we pick his brain about that. He’s got a lot of exposure to the Big Data world, and he’s got some interesting opinions, some interesting perspective, that I think you guys in our audience will really enjoy.

So a couple of highlights, for me, in this episode: really we talk about this tug of war between big government, big tech, and, of course, individual privacy advocates. What happens when everyone’s trying to shape policy and impact our lives, and what the outcome of that is. And I think, even more interestingly, we pick Evan’s brain and we peel back the curtains a little bit, to see what’s actually coming up in the next year, two years, couple years from now, that’s going to impact our lives and our work, and just see what the landscape is going to look like. Both domestically, here, in the US, and, of course, internationally, especially over in the EU.

So a lot of good stuff to take away from there. I certainly learned a lot. It’s an area that I wasn’t as well informed about, and Evan really has a way of breaking it down for you. So a lot of good stuff there.

And on the housekeeping side of the world, for those of you keeping track at home, this is episode number 18 of Hadooponomics. And it’s awesome for me to just get to this point. But for any of you guys who’ve been tracking our patterns, we do six-episode seasons, which means that this is, of course, our last episode of Season 3. It’s been a wild ride, had a lot of fun. If you remember back to our episodes this season, I think we’ve been growing exponentially in terms of our audience tuning into the show, as well as the caliber of conversation that we get to have, the impact that we get to have. We’ve covered everything from using data to find the DC Sniper to why you should never actually think about and build your own Big Data infrastructure, to even breaking the Panama Papers. So it’s been a wild ride, it’s been a lot of fun. So just be aware of that, as we go into our winter hiatus, and be on the lookout in 2017 as we pop up with Season 4. And, of course, on that note, if you miss us, you wanna catch up on some old episodes, whatever the case is, bluehillresearch.com/hadooponomics. We have links to all our shows, of course, the transcripts, the show notes, some links back to interesting things that our guests have talked about. And, of course, this episode is no exception, we’ll have links back to Evan’s podcast, his organization, and some of the cool things that he mentioned as well.

So that’s it for me today, I’m gonna step aside. Enjoy your interview with Evan.

All right, everyone, welcome to the podcast. Today we have an awesome guest for you. I’m here with Evan Swarztrauber. Evan is a Communications Director at TechFreedom, and also happens to host the Tech Policy Podcast. Evan, welcome to the show.

Evan Swarztrauber: Thanks for having me.

James: So, Evan, we’ve got a lot of stuff to talk about here. I’m really excited to have you on the show. But before we dive in to all the policy and the topic of the show, why don’t we just take a step back, and I’d love for you to introduce yourself to our audience.

Evan: Well TechFreedom is a free market, tech policy think tank. Some have described us as libertarians. My boss prefers the term “dynamist”, but you’ll probably have to look that one up. And we work on a lot of tech policy issues. It’s actually easier to say what we don’t do. We don’t do intellectual property; that’s what my boss would call the Vietnam of tech policy. There are no winners there. [laughs] But, of course, we work on Internet law and policy, telecom, privacy, surveillance, encryption, and a lot of consumer protection issues as well. So we’re based in Washington, DC, but always happy to work with people across the 50 states to advance policies that make innovation possible.

James: Mm-hm, yeah, and it’s great to have you on the show. Part of what makes me so excited to have you on here is, one, the TechFreedom organization, I think, is very interesting, and I think something that a lot of people in our audience can get behind, and would be interested to know what you’re doing. But you also host the Tech Policy Podcast, which I’ve had the chance to listen to. Really enjoy it, you get some pretty cool guests. Can you just kind of give us a little bit of background about that, and then we can work that into how we’ll take this interview forward?

Evan: Yeah, absolutely. When we were thinking about starting our podcast, we looked to see if there were other podcasts that were really only focused on tech policy. We didn’t see that many. There are a few, but what we mostly saw were podcasts that focus on tech gadgets. Like, this is the new iPhone, we’re gonna break it down, we’re gonna open it up, that kind of thing. But really, what we’re talking about is all that nerdy, DC policy and law that goes with technology and innovation. And, of course, maybe in a perfect world, us libertarians wouldn’t want to see such a need for advocacy in Washington to support innovation and to avoid dumb regulation, and things like that, but obviously that’s not the world we live in. And we think that those who are savvy, who are working on technology, and working on innovation know that eventually they’re gonna have some regulatory issues, so they’re gonna have to engage in the political process. So the podcast is a great way to explain these issues in, not a 30 second soundbite, but more around the length of this podcast. And we’ve had a lot of guests coming in to talk about a variety of issues. Of course, we have our own slant on things, but we try to [laughs] get a big tent of people. So we’ve thought it’s one of the best ways to talk through some of these complicated issues, but also in a way that both wonks will appreciate, and a layperson can understand as well.

James: Mm-hm, yeah, and what I think is a real treat is we’ve had a few podcast hosts themselves on this show before. People might remember The New Stack, our audience might remember that, or Ryan Goodman from Analytics On Fire. What I love about it is, as a host, you guys get a chance to borrow the brain and understand the perspective of 30, 50, 100 people every year, just talking to them and understanding what these latest and greatest thinkers are doing, and what’s on the horizon. You can also distill it into one point. And the guests that you guys have had, you’ve talked about everything from autonomous cars, to the impact of Ed Snowden, to what’s happening in the world of Big Data next on the policy side. And so what I wanna do is invite you on here to talk about what you’re seeing in the world as it pertains to the world of Big Data. What’s on the horizon, what should we look out for, what should we be caring about, and what, maybe, is not on our radar that needs to be? And so that’s what I think the real opportunity of having you here on the show is. And I’d like to sort of just open up the floor to just let you wax poetic, for lack of a better term, about what you’re seeing. Just lay the groundwork in terms of what’s ahead of us in the world of Big Data. Then we can sort of pick it apart, and then go into the most interesting areas from there.

Evan: So at a macro level, I see the issue as being, I think, two parts. There’s the government side of Big Data, which we associate with things like the Snowden revelations, with mass collection of data, with spying, with foreign intelligence programs. And then there’s the commercial side, and that’s where you get into Big Data, with Google’s advertising, and Facebook, and edge companies collecting data about their customers and using them for advertising, but also other things: autonomous vehicles collecting traffic patterns to make us more safe. And depending on where you are in the world, those two issues are not distinct. I think Americans, for the most part, see them as two separate issues. They say, when I engage with a technology company, that’s a contractual relationship. I’ve agreed to a set of terms and conditions, and even though I almost never read them, I understand that there’s a transaction there, and that’s why I’m getting this service for free. But then we’re also skeptical of Big Brother, and after what Snowden revealed, we were very skeptical about the relationship between companies and the government.

But if you look at a place like Germany, there really isn’t that much of a distinction, and part of that is informed by history. Of course, the Stasi, as recently as a few decades ago, was a big problem there, and it really does inform the way a country approaches surveillance policy. Whereas the United Kingdom has a long history of surveillance by the government, they had a monarchy for a long time, and they don’t really seem to care much about it. And America’s maybe somewhere in between. But depending on where you look in the world, there are concerns about what the NSA’s doing, what their own governments are doing, and what companies like Google and Facebook are doing.

James: And so what I wanna do in the second half of this episode is dive in to the intercultural differences between where you are in the world, and sort of this dichotomy of the issue. But to start off, a lot of our audience, there’s a lot of data scientists, Big Data practitioners, Big Data enthusiasts. And I think I’d like to start off by talking about what’s on the horizon that’s gonna impact the way they do their jobs. The type of data they can get, the type of data they can’t get, the amount of hoops that they have to jump through. And it strikes me that you’re in a really good position here, you have an interesting perch to sort of peel back the curtains and see what’s coming up next for us.

Evan: Well if we look at the recent attack on the company Dyn, or is it Din? [laughs] I think that’s where my lack of industry knowledge might be showing. But that was an attack that involved the Internet of Things, and that’s a big hot topic in tech policy right now. Because we’re talking about a future where it’s not just our phones, and our tablets, and our computers which might have cybersecurity tools built in. We might be talking about much less sophisticated devices being connected to the Internet, like coffeemakers and refrigerators. And, of course, there’s a huge cybersecurity concern there, and there’s a privacy concern. But we could also see the benefits. I mean, if you’re a Big Data scientist, the idea that entire homes are gonna be connected to the Internet, and data are gonna be sent to places where they can be analyzed, that’s exciting, right? I mean, the potential for better services, for improved products, for getting a sense of how people use products, and how to save money and energy. There’s all these great benefits. But, of course, as with all technology, the vast majority of its use will be for good, but all it takes is a handful of very high profile hacks or bad actors to really make people skeptical. And the Internet, when it was first invented, so to speak, didn’t have security in mind. It was much more about just the fun of connecting the world. But now when you look at something like the Internet of Things, a lot of cybersecurity experts and consultants are saying, cybersecurity has to be embedded in the Internet of Things. It can’t be some next step we take later on. If we want to see the promise of Big Data, and we want people to not be fearful of connecting more and more devices to the Internet, then cybersecurity’s gonna have to be “security by design,” which is the term that they’re using.
And you look at autonomous vehicles, there’s a case where concerns over privacy and cybersecurity could get in the way of life-saving technology. So these are these balances that need to be struck, between privacy and cybersecurity on the one hand, but also not hamstringing Big Data so that we can’t realize its promise.

James: Mm-hm, yeah, absolutely, right. You bring up autonomous cars, I can think of almost [laughs], obviously, the scariest scenario is you’re in your car and someone hacks in, and you have no control over being able to stop, or where you’re going, and what’s gonna happen to you, right? And that is sort of a scary, doomsday scenario, but one that strikes me as incredibly plausible, unless security is embedded into, or ingrained into the very fabric of the connected devices, right?

Evan: Yeah, absolutely, and we have to ask ourselves, what’s the safest way of having autonomous vehicles, because you could envision multiple scenarios. It could be the vehicles communicating with one another. It could be the vehicles communicating with the road. It could be a combination, or it could be each vehicle just relying on itself, which, in theory, might lessen the cybersecurity risk, if every single vehicle is just its own thing that looks around and reacts to its surroundings, but doesn’t communicate with them. And these are obviously important questions, and we wanna make the right decision. And maybe the safest way to avoid that doomsday scenario is vehicle to vehicle, and not vehicle to road. Maybe the idea that the entire highway system is Internet connected, maybe that’s too dangerous.

But what I wanna see is, I wanna see those studies happen. And I would hope that we allow people to test them, and that we have environments friendly to testing. We shouldn’t let hysteria and concern over hypothetical problems stymie that spirit of innovation that might lead to a good outcome. So there are certainly places that are leaders in terms of testing, like Pittsburgh, for example, which allowed Uber to test self-driving cars. California has a fairly permissive environment for testing. So if states wanna get a competitive advantage, maybe they can pass laws that allow for testing, and allow for an exploration of this technology, even though there might be some hiccups along the way. And you might have some accidents. And you really have to ask yourself, are you comfortable with the trial and error, and the messiness of the future? Or are you going to have the precautionary principle, where government comes in and sets very strict standards, and then companies have to clear really large hurdles before taking advantage of new technology?

James: Mm-hm, and you hint at the area that I wanna go into next. And we had a really interesting bit pre-show, when we were talking about how there’s legislation on the horizon, and how the future’s going to be shaped by the way that we push forward policy. And my question was, where is it coming from, and who’s driving it? Is it something that’s coming top down from folks in DC? Is it something that’s being pulled from people in Silicon Valley? Is it Facebook and Yahoo, and those companies, and Amazon, forcing the policy change to improve their own products and services? Can you kinda give us a lay of the land of what that looks like, in terms of what’s actually shaping the policy ahead of us?

Evan: Right, of course, all of these different actors in the space have very different interests. So if you’re a company, you’re more concerned about that government element than the commercial element. Because if you’re a company, you want to be able to collect data, to sell data, to engage in the business models that have made the Internet so successful, and have made a lot of the products we take for granted free. It’s not really free; as the saying goes, if something is free, then you are the product. And those companies, they’re gonna push back against the government, largely because of Edward Snowden. I mean, if you’re very cynical, and you think companies are only out for themselves, well, after Edward Snowden, there was a real incentive for companies to better secure their products and to fight back against government. Because now it’s a worldwide thing that’s been revealed, and you can’t just make these secret deals with the government.

But, of course, if you’re a civil society activist, maybe you’re just a privacy activist, you’re less concerned about companies’ bottom lines, and you’re more concerned about privacy, both commercially and from government. So you’re gonna fight for the best privacy possible, including putting restrictions on companies, and regulating them.

And then, of course, you’ve got the intelligence community, and their number one job is to stop terrorist attacks. And they don’t particularly care about companies’ bottom lines, and they don’t necessarily care about what the civil society/privacy activists have to say. Their number one goal is, I’m gonna stop attacks.

So all of these forces are yanking at policymakers, and you’re seeing the US government reacting in certain ways, and you’re seeing the European Union react in certain ways. And there’s certainly a difference in how those two institutions are handling this.

James: Mm-hm, and so, in a second, let’s dive into that difference, because it’s huge, right? I think a lot of people who are, perhaps, North American-centric don’t have an appreciation for it, right? But before we do that, what I’m curious is, is there a general flavor or general direction of the policy in the aggregate? Obviously we have at least three different parties with different priorities pulling at it, presumably, but in a tug-of-war, one side usually wins, right? Is there any way, do you see the direction that we’re moving towards? Or is it too messy to tell?

Evan: It’s really hard to say. Obviously, in an election year, nothing is happening, right? I mean, this is such a toxic issue, whenever you deal with surveillance issues. We’ve seen some changes, right? So when Edward Snowden first did what he did, the biggest revelation was that the government, through telephone companies in the United States, was basically doing mass collection on all of our telephone calls. And not the content, necessarily, but the duration of the call and who you’re calling. And you can tell a lot about a person from this kind of metadata. And because that program was so indiscriminate in nature, it was really mass surveillance. It was very Orwellian. That program got shut down by a law called the USA Freedom Act, which is kind of ironic, because the law it was reforming was the USA Patriot Act.

James: [laughs]

Evan: We’ve always got this wonderful naming system in the United States, where the law is named something positive, even though its contents might be complete crap. [laughs] But the USA Freedom Act did reform that program. And part of that was that it was a program on Americans, right? It was domestic surveillance. So Americans were saying, hey, I’m not a terrorist, why are you collecting all of my information all of the time, without any probable cause? But now you’ve got other programs that Edward Snowden revealed that were surveillance conducted on foreigners, and that’s a very different equation. A United States politician doesn’t particularly care what Germany might think about our intelligence programs. Or what other people around the world think. They’re thinking foreign surveillance is meant to protect the homeland, so we’ve gotta keep doing that.

And then you’ve got law enforcement, your local police department: what do they do with data? And all of these interests are competing. And what we’ve seen, because of the election, and other factors, in the way the government has been operating, we’ve seen a lot of stagnation. For example, the House voted unanimously, 419-0, to require that law enforcement always get a warrant to access email. A basic thing that we can probably all support. And the Senate just did not move, right? And now we’ve got an election going on. So it’s amazing how, in the tech industry, innovation just keeps happening, and companies keep chugging along. But there are all these policy implications that are kind of up in the air.

James: Mm-hm, yeah, and I think debating the merits, beyond just observing the logjam, is beyond the scope of this show, right? We try to keep it on the tech side. [laughs] But one of the things I wanna do is switch gears a little bit, because I think we could talk about this all day, and we’ve sort of expanded on that subject in a lot of our past episodes.

Evan: [laughs]

James: So you hinted at it at the beginning, right, there’s a legitimate difference between how the rest of the world perceives these issues, and just data in general, right? And the EU seems to be either far ahead, or way off, right? Let’s leave judgments about it aside, but can you give us a flavor of what’s happening in the EU? How is their policy regarding data different from what we have here in America and the rest of North America?

Evan: Well if you ever go to Europe, and you’re in a member state, and I mean EU member state, open up your computer and just browse the Internet. You’ll notice it. I was just there recently. Every single website, a big message will come up about cookies and data collection. You have to consent every single time. And in the US, that doesn’t happen, it’s just assumed. So what you’re looking at is the difference between an opt-in regime and an opt-out regime. And it might seem very trivial, but you look at the way that the US tech sector has evolved, and the European tech sector, and they’re very different. I mean, look at the top ten tech companies in the world, they’re all American, plus a couple Chinese. And that’s not really a fair comparison, because China’s a very centrally planned economy, and the government has a lot of influence. But if we’re comparing industrialized Europe and the United States, it’s a fair comparison. The Western world, so to speak. Europe doesn’t really have these big tech giants, and a lot of European startups, their goal is to move to San Francisco or get bought by an American company. And you ask yourself, well, why? Why is it so unfriendly? And part of that is the commercial privacy stuff. I mean, the difference between opt-in and opt-out could be the difference between success and failure for a business. If Americans are much more comfortable handing over data in exchange for services, then we might be a more successful market than Europe. And we’re seeing a backlash against American tech companies in the form of antitrust regulation and big fines. There have been reports of a proposed $3 billion fine on Google for abusing its dominance in the search engine market. So you’ve got, clearly, a backlash against American tech companies. The European Commission, the European Union, these regulatory bodies, they’re very skeptical about American tech companies and their data practices.
And part of that skepticism is fueled by the Snowden revelations, which showed a cozy relationship between the intelligence apparatus and American firms. The point I would make is that Europeans should also look at what their own governments are doing. After the horrific terrorist attacks in Paris, France passed a bill that privacy activists would not say is very good. Yet there seems to be an undue influence on the United States because of Edward Snowden, when really, governments all over the world have Big Brother problems.

James: Sure, and so bringing this a little bit more to the very tangible, Big Data side, I spent a long time covering the Microsoft Cloud, Amazon Cloud, Google Cloud, and one of the huge differences in Europe is now you have to have data centers located locally, right? You can’t necessarily be storing your data, or other people’s data if you’re an organization, unless that data is stored somewhere in the EU. Curious if you can talk to that a little bit, and sort of how you see that shaking out in the future as well.

Evan: Yeah, data localization is a big issue. If you think about the Internet as being borderless, and a global entity that is not controlled by any one country, data localization would be the opposite of that. That would be the idea that each country has its own Internet, and each country decides what that Internet experience looks like for its citizens. And you might understand the impulse, right, why a country and its people would want their data being stored in their country, right? So the basic idea is, I’m Irish, I have a Facebook account, it’s stored in Ireland, or I’m Romanian, and it stays in Romania. But really, that kind of contradicts the promise of the Internet, the idea that companies can seek out the best arrangements based on, maybe, cheap electricity, or where they can get the space to have a data center, where it makes sense to put servers. You want those decisions being made based on what’s best for consumers, maybe what’s best for the company, what’s gonna lead to the most innovation, not based on nationalism, and saying it needs to be here because this is my country, and the data have to be here. And, of course, that’s also a law enforcement concern, right? If data are stored in your own country, it’s a lot easier to get a warrant to access it. You don’t have to go through international processes and make arrangements with other countries about how we exchange data.

But that’s a big problem. I mean, if you’re a tech company, data localization’s not good, and you might just decide to pull out of certain markets entirely, rather than make accommodations for individual governments.

James: Sure. So let’s take that and bring it one more step, bring it back to some of the folks in our audience. We have a lot of, as I mentioned, practitioners as well as enthusiasts, and people who, their jobs will be impacted by what’s gonna happen in the next one, two, three, five, ten years. What’s on the horizon in terms of this landscape shifting? Are we gonna see more data localization? Are we gonna see more restrictive company policies? What’s going forward here, what should we expect?

Evan: So one big issue concerning the relationship between the US and the EU that is definitely alive is cross border data flows. So obviously your data scientist listeners know that we transfer data across the Atlantic, and it comes back over. And in the year 2000, there was an agreement called Safe Harbor that allowed for that to go through, and it was, essentially, an agreement that companies would agree to a set of standards about protecting the data. That agreement got struck down by the EU Court of Justice in the Schrems case, partly as a result of the Snowden revelations. A new agreement, called Privacy Shield, you gotta love these hilarious names, came up, but that’s also on shaky ground. And really, the big issue, if you’re a Big Data guy, is that you need the trust of the citizens that you’re collecting data about, right? So think about, in Europe, there’s a distrust of American companies, there’s a distrust of the intelligence apparatus. If you want those customers to be comfortable handing over data that can be very useful, that can lead to new innovations, you need to get their trust. And what the United States needs to do, from a policy standpoint, is clarify the way it spies on foreigners, and put some safeguards and some oversight there that will really be enough to let Privacy Shield hold up.

So right now, you ask any company, you ask Google, Amazon, Facebook: is Privacy Shield a good agreement? They say, yes, of course it is, because that’s what we have now, and that’s what’s allowing us to do business. But, of course, maybe a better agreement, with more assurances, will stand the test of time. And now the United States has until December 31, 2017. That’s when a certain foreign surveillance program expires, under the Foreign Intelligence Surveillance Act. They have until then to do some substantive reform, to really convince the citizens of Europe, and others around the world, that there is adequate oversight, there is due process, that foreigners are not being spied on in an inappropriate way, and that counter-terrorism can be done with respect for people’s privacy. If we can get those reforms in place, then what’s on the horizon is very good, because then companies can continue to transfer data across the Atlantic Ocean. They can continue to have a global Internet, not a localized Internet. And that’ll be really good. But we’re gonna see that debate heat up in the next few months, as we approach that expiration date on that program. And you’re gonna see Europeans voice their opinions, companies, civil society groups, human rights activists. It’s gonna be an interesting debate. But I think the United States, if we are able to seriously reform our surveillance practices and convince the world that we’re not doing anything too terrible [laughs], then the outlook is very good for Big Data and Big Data science.

James: And so, along those lines, and I think that while we have you on the show we’ll take advantage of your perspective, you have the opportunity to talk to a lot of people on an ongoing basis as part of your job at TechFreedom, as well as just being the host of the podcast. Aside from that, what trends are on the horizon that you think are not on our radar yet but definitely should be?

Evan: Certainly the autonomous vehicle debate should be on your horizon. There are a lot of groups, right now, that are pushing for the FCC to get involved there. And you might be scratching your head, because when you think about transportation, you think about maybe the Department of Transportation, the DMV, you think about those regulators, not necessarily the FCC. But this FCC has gotten very activist. It’s been issuing a lot of regulations. You might have heard about Net Neutrality, that was a big debate. And there are people saying the FCC needs to be approving of autonomous vehicles. And for dynamists, for libertarians, that’s a bit of a head scratcher. It’s kind of worrisome. I mean, we wanna see experimentation in autonomous vehicles. We think it has the potential to stamp out most of the roughly, what, 30,000 vehicular deaths a year, 3,000 alone in California, which is the cradle of innovation. And we wanna see that testing and that innovation go forward unfettered by concerns over privacy and cybersecurity. We’d like to see those concerns addressed, if they arise, right?

James: Sure, and you raise a good point. It was a small part of what you said, but I think has enormous implications. We have to figure out the insurance issue. And in my mind, that’s ripe for massive disruption in the entire insurance industry.

Evan: Absolutely.

James: Because if auto accidents decrease by 99%, what do you do if you’re an auto insurer, right? [laughs] You can’t possibly write policies that are gonna be profitable if no one’s ever getting in car crashes, right? So there’s-

Evan: Right.

James: Massive disruption as sort of a total effect stemming from that. And I think, obviously, the insurance industry’s not gonna be alone. I think there’s a lot of implications that we need to think through.

Evan: Yeah, absolutely. And if you think about the way that industries can adapt, I mean, if I’m an insurance company right now, and I’m used to writing policies about human problems, human liability, I’ve gotta start thinking about robots. I’ve gotta write a policy that accounts for the very small chance, maybe, that this robot screws up. Or that weather might cause problems. I mean, that’s a very interesting thing, to pivot from the human world, where human error is 90 something percent of the reason we have car accidents, and now going to cybersecurity. And maybe the insurance has nothing to do with the car crashing, maybe it has something to do with your data being hacked, and that’s an interesting equation. But if I’m a company, and I have the resources to be a little savvy and get ahead of this, I should start thinking about how to write those policies.

James: Great. Well, Evan, one of the things I always like to do is, I’ve had a chance to listen to your podcast, I’d recommend it to our audience, but if we wanna find out more about who you are and what you’re up to, where are we going? Where can we go to sort of find out more?

Evan: Absolutely. So techfreedom.org, of course, that’s our website. You can check out some of the work we’ve done: our filings, our comments on regulation, our lawsuits against government overreach, all that good stuff. And then to keep up on these issues, I mean, there’s no better source than the Tech Policy Podcast. We’re always talking about a wide range of policy issues. I mean, you can catch that at podcast.techfreedom.org. You can also just search Tech Policy Podcast in iTunes, Stitcher, or wherever you get it. And then follow us on Facebook and Twitter. That’s a good way to get in touch with us whenever you want. And yeah, I hope you enjoy our work.

James: Excellent. Well, we’ll have links to that up on the Bluehill website, so in the show notes we’ll have links back to the podcast and to the website, and all that stuff. So Evan, it’s been a real pleasure to have you on the show. I just wanna say thank you for coming on.

Evan: Thank you so much, and I’m looking forward to having you on my show.
