31: User Privacy as a Differentiator with Adam Stone of Secure Digital Solutions
Published March 3, 2020
Run time: 00:58:23
User privacy can be a key differentiator between you and your competition. Adam Stone of Secure Digital Solutions joins us on the show to talk about several concepts around privacy, including the California Consumer Privacy Act (CCPA), Privacy by Design, risk management around third-party dependencies, and how you can take steps in your process to ensure your users' private information is secure.
In this episode, you will learn:
- Why you (might) need to care about the CCPA
- How millennials might find your company more appealing if you focus on privacy
- Ways to vet your development team to ensure they think about privacy
- Why you need to think about your values before assessing your third party dependency risks
This episode is brought to you by The Jed Mahonis Group, where we make sense of mobile app development with our non-technical approach to building custom mobile software solutions. Learn more at https://jmg.mn.
Recorded February 5, 2020 | Edited by Jordan Daoust
Tim Bornholdt 0:00
Welcome to Constant Variables, a podcast where we take a non-technical look at mobile app development. I'm Tim Bornholdt. Let's get nerdy.
Today we are chatting with Adam Stone of Secure Digital Solutions. Adam is a data privacy and security executive with more than 20 years of experience implementing and developing data privacy and security innovations. In this episode, we talk about the California Consumer Privacy Act, the CCPA, and why you should care about user privacy, the principles behind the privacy by design framework and how you can incorporate that into your app, managing third party dependencies within your app, and we touch on the battle between the FBI and Apple surrounding end-to-end encryption. So without further ado, here is my interview with Adam Stone. Adam Stone, welcome to the show.
Adam Stone 1:07
Hi there. Thanks for having me, Tim.
Tim Bornholdt 1:09
I'm really excited to have you. Like we were kind of talking about before the show, I think we're both kind of privacy nerds in the best sense of the word. So I think this is going to be a fun discussion for folks that maybe aren't as comfortable with the whole privacy and security angle of app development.
Adam Stone 1:27
I appreciate that. And yes, indeed, I'm a nerd and I've tried to de-nerdify the things that I have to say about the topic.
Tim Bornholdt 1:37
That's what this is all about. I like that, de-nerdifying. Tell us about yourself and tell us about Secure Digital Solutions and what you guys offer.
Adam Stone 1:46
Well, thanks a lot. I am a Twin Cities based consultant, working for a firm in the Minneapolis area, basically St. Louis Park, called Secure Digital Solutions. The firm has been around since 2005. It started out as something more of a traditional IT security consultancy, focusing a lot more on the technical aspects of security but also the management aspects of security. It has evolved over several years, and we can now say that we are a management consultancy that focuses on data security and privacy management, operational performance, organizational improvement, governance, compliance and strategy. I've been with this firm for five and a half years, and I am responsible for our professional services. We have a second pillar of the business, which is in fact a business-to-business software app called Trust Me. Its purpose is to enable information security leaders to measure and track their performance relative to information security maturity.
Adam Stone 3:15
For me, I've been in the privacy and security business for a little over 20 years. I fell into it by happenstance. I do not have a technology background. I did not have it when I entered the business. Instead, my background, well, my first degree is in philosophy. And I used that degree to jump right into the world of accounting. I spent a few years doing that and transitioned from that to becoming a professional trainer focused on helping accounting managers, CFOs and the like use software designed to help their business. And then around the dot com bubble, I fell into this world of data privacy and security, and I haven't really looked back. I've had lots of opportunities to serve across myriad industries, starting in financial services and insurance and branching out into academia, marketing, manufacturing, pharmaceutical production and distribution, and healthcare. I have enjoyed my time working in, for, and on behalf of all of these different industries, and have been happy to be part of the growth of the privacy industry and the privacy profession over the past 10-15 years.
Tim Bornholdt 4:57
Well, I mean, I think going into it from a philosophy angle and moving into privacy, I mean, there's so many things you could touch on on theory, and, I mean, there's a whole thing we're going to talk about with privacy by design where I can see why it's good that you have that full philosophical background because I think it lends itself well to this industry.
Adam Stone 5:18
Well, I would agree. It's, you know... Privacy, and even information security, is not a technology subject in my estimation. It is a human subject, specifically human fallibility, human greed, and other negative character traits that lead to our need for security, our need to secure our information assets, but also to preserve the secrets that we share with folks, whether we are sharing these secrets with our best friend or with some large corporation or with the government.
Tim Bornholdt 6:02
Well, that lends nicely into our first topic of discussion here, which would be around the government. And I want to talk a little bit about the CCPA. So I know that a lot of our audience is, you know, app owners and product owners, and they probably have heard of this acronym and know it has something to do with California. And I mean, we're here in Minnesota, we don't need to worry about that. Right?
Adam Stone 6:27
Well, I wouldn't say that. California tends to be the leader; they stand in the vanguard with issues like this. And CCPA is no different. The acronym, by the way, stands for California Consumer Privacy Act of 2018. But we'll just call it CCPA because that's easier.
Tim Bornholdt 6:52
Talk about it a little bit. Like in terms of an app owner, what about it? Why does it exist and why do I need to worry about it?
Adam Stone 7:02
CCPA has behind it a long history of discussion and debate about the role of these so-called computerized systems or databases, which really started concerning people as early as the 50s and 60s. And it really is just a natural evolution of those concerns on behalf of the public, both when looking at government ownership or control of data, but also, now importantly, the control of data in the hands of private corporations. The law itself emerged out of the efforts of an activist in California. This individual, who turns out to be a very successful business person, had a lot of resources behind him, had agitated for a new data privacy standard, and had effectively put forward a bid to get a referendum on the ballot in 2018 that would have created substantial privacy requirements for organizations if it had passed. In response to this threat of a referendum, the legislature in the state of California decided to create a bill to address these issues, and because of the timing, the bill was rather hastily put together. In fact, it emerged almost out of nowhere and caught folks that were not watching California politics closely really off guard. In the early days, the media often characterized the law as something similar to the E.U.'s General Data Protection Regulation, or GDPR. There are really quite a few differences between CCPA and GDPR. The basic mantra defined within GDPR, around providing and preserving privacy rights for individuals, is maintained, but CCPA takes a uniquely American approach to privacy. And that is in fact where it diverges from the GDPR.
Tim Bornholdt 9:42
It's interesting that there's these regulations, for one, coming from GDPR. And I heard you on an earlier podcast kind of explain that privacy from the European standpoint is, you know, baked into this history of fascism that they have in Europe. And so you know that there's a certain amount of privacy that you would want to have after going through a regime like that. And then there's kind of the American entrepreneurial, you know, look at privacy where, well, someone's giving me this data, we can argue willingly or unwillingly, and since they gave it to me, it's mine; I can do whatever I want with it. I am curious to hear what your thoughts are around why we need to worry about privacy. As an app owner, I've got 600 other things I need to worry about with my budget and with timelines and with user experience, and with everything else that goes into building an app. Why should I care about the user's privacy?
Adam Stone 10:41
Well, there's all sorts of good reasons. And there's different angles that folks take. There is the angle of the fear of government overreach: if you enable governments to maintain all sorts of data about you and to hold your deepest secrets, that makes it all the easier for government to control you and the population at large. It can also lead to societal catastrophes, such as we saw in Europe starting in the early 30s. And that really did create a sea change in the way folks understood the importance of keeping their secrets to themselves. The question around secrets, however, is an interesting one, because there's obviously two schools of thought, roughly speaking, around the keeping of secrets. One is that if you've got a secret to keep, then clearly it must be, you know, something negative about you that you don't want to get out to the public. Perhaps you've done something illegal; you don't want the authorities to know about it. Those folks tend to be a little bit more antagonistic when it comes to privacy and a little bit more cynical about the reasons why folks assert the need for privacy. The other camp sees privacy as a key component of human psychological health as well as societal health. And naturally, the threat of harm in the event of a breach of one's privacy is always present, whether it's physical or financial or humiliation or discrimination. There have been myriad reasons throughout history why folks feel inclined to hold certain secrets about themselves, while selectively disclosing other information to parties that they either trust implicitly or have to trust by virtue of the circumstances around them. My personal opinion is we're in that latter state, where we are effectively in a position where we have to trust the organizations that we divulge our secrets to. Because, frankly, to survive in modern society, it's virtually impossible to get off the grid, quote unquote.
And with that stated, we have this innate need to trust the individuals to whom we are disclosing our secrets. And, you know, sometimes we just have to trust that the companies we do business with will do the right thing with our data. It's only after those companies breach our trust, by some bad act or some case of negligence leading to a data breach, for instance, that we might lose trust in those organizations and choose not to divulge any more secrets to them.
Tim Bornholdt 14:05
Well, and that would render a lot of, you know, what we do as app developers useless. A lot of times we're trying to provide value with that data, and there's a lot of good reasons to collect that information. But if you don't have good intentions from the get-go, in that you're not thinking about how to protect this data once it's been given to you and treat it with respect and care, you're just one hack, or one act of negligence by an employee, away from being on the front page, showing the world how bad it is to trust you with their data.
Adam Stone 14:39
Yeah, well, that is the reality, for better or worse. Most of us in Western society have been brought up with this notion of the golden rule and variations of that ideal. And the golden rule, we know, is to do unto others as you would have them do unto you. That doesn't seem to always carry through in commercial relationships, and it's unfortunate. But those were the early days. My sense is that a lot of organizations are recognizing the business value of identifying trust as a key part of the relationship with their client base. And obviously, the profit motive is a big part of that. I'd like to think that we are thinking with a bit more of an ethical lens as we mature around this notion of app development, and even just what the internet is and what purpose it serves from a social perspective.
Tim Bornholdt 15:56
Absolutely, with that in mind, let's say we are working for an organization. And we do want to build that trust; we see the value in privacy and protecting that privacy. I know that you've talked in the past about a framework that people can follow called privacy by design. Maybe we can talk about that a little bit and explain what that is and how that can actually serve as a way to kind of enforce that belief that privacy is important from the ground floor when we're building mobile software.
Adam Stone 16:29
I really appreciate you bringing this up. I am a huge advocate for privacy by design. Interestingly, this is not a new area of thinking around privacy, and it is starting, albeit slowly, to pick up some steam within the app development community. And I like to think of myself as, you know, one of the folks that really promotes the implementation of privacy by design in an effort to improve trust between app developers and their clientele. Privacy by design, or PbD as folks in the privacy industry call it, is a set of principles that was developed by a wonderful lady named Ann Cavoukian. Ann at the time was serving as the Information and Privacy Commissioner for the province of Ontario in Canada. She developed this document around 2009. And at the time, it had kind of emerged quietly, but its principles are so salient that it has maintained, and it has really aged like a fine wine. In my view, the principles that she brings forward have not needed any form of enhancement or change. They seem to be a pretty solid set of principles. And there are seven of them. I'm sure that we'll talk about them as we go along here.
Tim Bornholdt 18:08
I was gonna say, what are some of those principles?
Adam Stone 18:11
Principle one is that app developers should consider privacy proactively, not reactively, and should take an approach that is preventative, in that it prevents exposure of one's private information, versus having to remediate these circumstances after the fact. And so principle one is proactive, not reactive; preventative, not remedial. Number two is privacy as the default setting. And when we get a moment, I'd love to talk about this a little bit more. It's one of my favorite principles, privacy as the default setting. Foundational principle three is privacy embedded into the design. And that's, of course, very relevant here for software and web app developers. Privacy is embedded into the design from ideation to sunset of a particular application. We'll talk about this more in a bit. Number four is full functionality, meaning that developers should develop towards a positive-sum approach, not a zero-sum approach. Developers should not create a system that says either you use my system and you give away some of your privacy, or you don't use my system, you choose. That's a zero-sum game. A positive-sum game provides some balance between the giver and the receiver, as it were, the client and the app developer. So principle four is full functionality, positive-sum, not zero-sum. Principle five is end-to-end security, full lifecycle protection. And I think that makes sense inherently. We can talk about that in a little bit. This is where security comes into play. We know that security is a key enabler of privacy. In fact, without security, privacy can't really exist in the digital world. And so we do need our friends in information security to help us maintain privacy. Principle six is visibility and transparency: keep it open. And in short, what that means is avoid creating a black box.
Don't create a situation where an individual inputs their data, it goes into some system that is completely opaque to the end user, and it spits out some data, and the individual has no idea exactly how that data came to be, because the mechanisms, the algorithms inside that software application, just are not divulging those secrets. And so we're promoting visibility and transparency in principle six. The last principle within PbD is respect for user privacy: keep it user-centric. And this really is part and parcel of the movement towards user-centricity in software development, and to develop applications that are maximally usable for the individuals they're intended for. If you add privacy to that usability equation, then what you're saying is that you want to make sure that individuals don't have to search hard to determine what privacy options they choose or deny, and that they are presented with an application that outwardly expresses the value of the relationship between the app developer and the user of that app. So those are the seven principles of privacy by design.
Tim Bornholdt 22:33
Well, there is a lot to unpack in all of that. I think the one that I wanted to touch on was the one that you actually brought up and said you want to talk on was number two of the privacy as the default setting.
Tim Bornholdt 22:46
Whenever we're talking with clients about settings pages, you know, it's always funny when you get to a point in designing software, and people say, "Well, we'll just make it an option and people can choose one way or the other." There's some stat that only about 10% of people ever even open the settings screen of an app to look at what the settings are. So as it relates specifically to privacy, you can extrapolate that and say, you know, if you aren't private by design, a lot of users aren't going to go in; it's going to be people like you and me that understand it, software developers that understand privacy, who go in and make tweaks as necessary. But if you aren't thinking about privacy at the forefront and making the software as private as possible from the get-go, then most people aren't going to go in and change that.
Adam Stone 23:40
That's right. And this is my favorite, personally. But it is something that, I will acknowledge, is very, very difficult for software developers to reconcile. That is because a lot of apps, as we know, are released to the public for free, quote unquote, and those developers need to recoup the costs of their development, but also, you know, make a living off of these apps. That is really antithetical to the notion of developing a system that is private by design, that doesn't collect data by design, because we need to make money. We need to make a profit as we release these apps to the public. Still, there are a significant number of developers out there that are really seeing the value of trust and authenticity as the key to standing out among their peers and their competitors, in effect giving them a competitive edge by virtue of design choices that outwardly express respect for an individual's privacy right from the get-go. And if you'd like, I could give a great example, a real clear example, of how this plays out in the real world.
Adam Stone 25:15
Almost everybody has a mobile phone nowadays. Like many people, about five years ago, I had really resisted buying into, for whatever reason, Apple iPhone products. I just didn't want to do it. And so the mobile phones that I was using back then were obviously the competing version, which in this case was Android. And I stuck with it from that point on. At some point, I don't know, two, three, four years ago, I was convinced to move to an iPhone. And so I did. As I started opening it up and looking at some of the settings, I was surprised by something. Apple, as a default setting for many of the applications and utilities and other things that make an iPhone an iPhone, had many of the most data-intrusive features, how would I put it, the leakiest aspects of interacting with the phone, those potential points of leakage, turned off by default. In other words, I needed to opt in to have Apple execute some of the features within the phone that were leakier than others. And that sent a huge message to me. That told me that Apple has actually thought about this privacy issue and has embodied that whole notion of outwardly expressing respect for an individual's privacy through these, you know, really small signals, but they were strong enough for me to pick up on. And really, when you take a look nowadays at the default controls that are in place when you buy a new Apple iPhone, and you compare that with its competitor, you'll notice a complete difference in the philosophy behind how these phones are set up. Apple iPhones have much stronger privacy controls by default than Google Android phones. And Apple, recognizing the value in that differentiator, actually created a commercial that touted how strong their phone was with respect to data privacy controls.
And the implicit message that Apple sent is, you know, look, our competitor, that nasty Google company, you know, they take all of your information. Their phones are leaky as all get out. So you should buy our phone because we help protect your data. We don't leak as much data as our competitors do. So they were literally using that as a market differentiator. I thought that was brilliant.
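The "privacy as the default setting" idea that Tim and Adam discuss can be sketched in code. Here is a minimal, hypothetical Python example, where the setting names and structure are invented for illustration rather than taken from any real app: every data-sharing feature ships disabled, and only an explicit user action turns one on.

```python
from dataclasses import dataclass, fields

@dataclass
class PrivacySettings:
    # Every data-sharing feature defaults to off; the user must opt in.
    share_location: bool = False
    share_usage_analytics: bool = False
    personalized_ads: bool = False
    crash_reports: bool = False

    def opt_in(self, setting: str) -> None:
        """Enable a single feature by explicit user action."""
        if not hasattr(self, setting):
            raise ValueError(f"Unknown setting: {setting}")
        setattr(self, setting, True)

    def enabled(self) -> list[str]:
        """List the features the user has actively turned on."""
        return [f.name for f in fields(self) if getattr(self, f.name)]

# A fresh install shares nothing until the user says otherwise.
settings = PrivacySettings()
assert settings.enabled() == []

# Sharing starts only after a deliberate opt-in.
settings.opt_in("crash_reports")
assert settings.enabled() == ["crash_reports"]
```

The design choice here mirrors Adam's point: the zero-data state is the one a user gets for free, so the 90% of users who never open a settings screen are protected by default.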
Adam Stone 30:51
Absolutely. I mean, what you're seeing is an application not only of privacy as the default setting, but also of a positive-sum development attitude versus a zero-sum one. Because if we use the Apple iPhone with these settings, we keep the settings on, in other words, we do our best to keep the phone from leaking, that does not materially impact the functionality of the phone itself. Yes, it creates a few impediments to convenience and being able to instantly share certain bits of information. But if you are willing to balance that with your desire for privacy, I think the notion that the Apple iPhone is operating on a positive-sum versus a zero-sum design approach is absolutely brilliant. And I should mention, the folks that put themselves out there as being more attentive to data privacy, they, of course, expose themselves to criticism. I would argue that those organizations that are authentic, and that express themselves in terms of authenticity, are the ones who are going to win the day, whether a breach happens or not. Because we are human, and bad things happen, mistakes happen, and no system is entirely foolproof. It's those organizations that come right out of the gate and say, "Look, we screwed up. We apologize. And here's our plan to make amends, improve our system, shut down this, change this, whatever it takes." We don't see that often enough, unfortunately. What we see is that when mistakes happen, whether or not a company puts itself forward as being a privacy-conscious organization, companies tend to circle the wagons and operate in this sort of cone of secrecy, as it were. And that creates a suspicious public, because, you know, when folks don't admit their faults right away, that naturally causes us as humans, or in this case as customers of a particular company, to say, "So what are they trying to hide, and why?"
And it allows us to sort of imprint upon that organization a real sense of distrust in the way they do their business. I've also read, though I don't have the evidence in front of me to share with you, that the millennial generation really, really values authenticity. And if an app developer is targeting that audience, then they ought to really think clearheadedly about this notion of positive sum versus zero sum design approaches.
Tim Bornholdt 34:22
Yeah, as a millennial, I don't speak for every millennial, but I can certainly say that it's a notch in Apple's belt for being authentic when they do own up to those issues. So one thing around privacy by design: if I am a product owner, and I'm looking to have somebody come in and build an app for me, and I really do care about privacy and privacy by design, are there tips, or any sort of ways I can tell, that whatever team I'm hiring to come in and build out an app does care about privacy by design?
Adam Stone 35:04
I think that's a great question. The seven principles are just that; they're principles. And so they're very high level statements. They're meant to be interpreted relatively broadly, so that they can actually be implemented in real life. I would suggest that as app owners are looking for development help, they should take these seven principles and frame them as questions to their potential app development partners, and get the responses back from those development partners. So a question might be, "What is your design approach to the settings within a system relative to data privacy and security?" And just leave it open-ended like that, see what the developer comes back with. I think it would signal a lot to owners of apps, and eventual apps, just to flip these principles into open-ended questions.
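Adam's suggestion, flipping the seven principles into open-ended vetting questions, could be sketched as a simple checklist. The wording of these questions is my own illustration of the exercise, not language from the interview:

```python
# The seven Privacy by Design principles, each paired with an open-ended
# question an app owner might pose to a prospective development partner.
PBD_QUESTIONS = {
    "Proactive, not reactive": "How do you surface privacy risks before a feature ships?",
    "Privacy as the default setting": "What is your design approach to default settings for data collection?",
    "Privacy embedded into design": "Where in your development lifecycle do privacy reviews happen?",
    "Full functionality (positive-sum)": "How do you avoid forcing users to trade privacy for core features?",
    "End-to-end security": "How is user data protected from collection through deletion?",
    "Visibility and transparency": "How can a user learn what happens to the data they submit?",
    "Respect for user privacy": "How do users find and exercise their privacy choices in your apps?",
}

def vetting_questions() -> list[str]:
    """Render the checklist as numbered, open-ended interview questions."""
    return [f"{i}. {q}" for i, q in enumerate(PBD_QUESTIONS.values(), start=1)]

for line in vetting_questions():
    print(line)
```

As Tim notes in the next exchange, the answers matter less as checkboxes than as a signal: a partner who has thought about these topics will have something substantive to say to each one.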
Tim Bornholdt 36:15
Yeah, I agree. Like, if somebody came to me and asked me that specific question, I almost feel like I would go off on a big rant, I guess. Because I think you can tell the people that do care about it. If you can just tell the passion, it goes back to the authenticity point. If somebody actually has thought about this stuff and cares about it, you're going to elicit a response out of the developer. Whereas if they ask the question and the developer just says, "I don't know; what is that?" That's a clear indication as well.
Adam Stone 36:51
Yeah, and, oh, absolutely. That is a flag and, you know, that ought to factor into the decision on which sort of partner you choose.
Tim Bornholdt 37:02
Switching topics here a little bit. When we're talking about incorporating third party dependencies, I mean, there's so many routes to go down. I guess one question right off the bat is, when we're talking about incorporating a third party dependency into your app, what does that even mean? What are we talking about with that?
Adam Stone 37:20
Well, you know, in today's world, we are highly dependent on other organizations who have established systems, products, platforms, and controls, to build whatever it is we want to build. If I am a small business owner, and I have an idea for a new app, my inclination, because I lack the resources to spend all the money necessary on research and development and all that comes along with that, is to look for established organizations that have already developed the platforms and the protocols and the programs that I can use as a foundation for my application, sort of a building block approach, which is substantially less resource intensive than trying to create something on your own. And so with that in mind, we are oftentimes highly dependent on at least a couple, if not many, many more outside parties helping to enable whatever it is we're trying to make happen within our software or web application. And all of these parties that you rely on, of course, are third parties. In most cases, they're all their own separate corporations with their own separate profit motives and philosophies and value systems and so on and so forth. And when we have to rely on those organizations, we are relying not only on the fact that their product will, you know, make whatever it is we want to make work, but that they will do the right thing when they gain access to, let's say, personal data that we collect through our application. And that's where the real balance needs to come in. Ultimately, just as I need to be able to trust that an app I am interacting with as an individual will do the right thing, business owners need to trust that their third party service providers will in fact do the right thing when it comes to treating data in an ethical fashion.
Tim Bornholdt 39:55
And I guess that's what it comes down to at the end of the day: trust, right? Is there any, like, I don't know, I guess there's no real way around it at the end of the day. We're going to need to include some third party dependencies as app owners. I mean, I can't even build my software without relying on the third party dependency of Apple offering a platform for me to develop an app, or Google or whoever. But there are certain third party dependencies that we might take on because it's kind of either en vogue or it's something that we think we need to have. I'm thinking, for example, one of the easy, low-hanging fruit would be around login and authentication. Everybody thinks, well, we need to have login with Facebook, and we need to have login with Google, and we need to have, you know, the seven different abilities for people to log in through their social media platforms. But I think people sometimes add those dependencies in without really thinking through what that actually means. Are there steps or exercises that you might lead your clients through to have them explore what third party dependencies they're using in their apps and think through the implications of including those?
Adam Stone 41:12
Yeah, that's a great question. Before even thinking about the issue of how we manage third party risk, I would recommend that app owners first just discover for themselves, and actually write it down on a piece of paper to make it real, where their parameters are. What are your values as a business owner? What is your objective for your business? And I would contend that if one of those objectives is to maintain the trust of your clients, then that really informs the approach that you take with respect to finding vendor partners, whether, you know, it's software developers or Apple or any of the myriad applications that we use for various bits of functionality in our web apps. With that stated, there are a number of platforms and frameworks in place to help us manage third party risk. Unfortunately, for small and mid-sized businesses, these platforms and frameworks oftentimes come with a cost. And it's not only the cost of actually buying into a third party risk management framework, but also the opportunity cost of stopping for a minute and taking the time to assess the risk of your relationships. I recognize that as an app owner, or a future app owner, one of your primary motives is to get that app out there as quickly as possible, and anything that stands in the way of that creates friction, is difficult. I understand that. And so, again, I go back to: what is your value proposition? What are your internal values? How would you want to express those? And then let that inform your business decisions on how much time and money you want to invest in properly managing third party risks.
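The first step Adam recommends, writing down what you depend on, can start as a simple inventory of vendors and the personal data each one can see. The vendor names, fields, and flagging rule below are hypothetical, a sketch of the exercise rather than any formal risk framework:

```python
from dataclasses import dataclass

@dataclass
class Dependency:
    name: str            # the third-party vendor or service
    purpose: str         # why the app includes it
    personal_data: list  # categories of user data the vendor can access

def needs_review(dep: Dependency) -> bool:
    """Flag any dependency that touches personal data for a closer look."""
    return bool(dep.personal_data)

# Hypothetical inventory for a small mobile app.
inventory = [
    Dependency("Push notification service", "message delivery", ["device token"]),
    Dependency("Crash reporter", "stability monitoring", []),
    Dependency("Social login provider", "authentication", ["email", "profile name"]),
]

flagged = [d.name for d in inventory if needs_review(d)]
# Two of the three vendors can see personal data and warrant a risk review.
assert flagged == ["Push notification service", "Social login provider"]
```

Even a list this small makes the trade-off Tim raises next concrete: every entry with a non-empty data column is another party you are trusting with your users' information.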
Tim Bornholdt 43:25
That's fantastic advice. I think a lot of times when you're a startup and you have an idea, and you go forward... I know when I started my business, values weren't the first thing I thought of. The first thing I thought of was how we could make money. You kind of rely on your own internal core set of values when you do that, but as your organization grows, you actually do need to think about what your core values are. And if, like you said, they include trust and treating users' information with respect, that kind of lowers the number of dependencies you might want to rely on. Because if you throw an ad network into your app, you may not knowingly be leaking information, but every single ad platform you include is just another potential vulnerability for you. So that makes perfect sense to me.
Adam Stone 44:27
Well, it really does come down to that. I recognize that this is not an easy decision for app owners to make, because there's this tug of war between wanting to do good on behalf of your clients and future clients, and maybe society as a whole, and then having to put food on the table at the end of the day. How do we do that in this new world of app development, especially if we're offering these apps for free or at a substantially subsidized price? Well, we need to make our money somewhere, and so we do that by, effectively, reselling the data we collect through the use of our app to other parties. So it's a real balancing act, and it's not for everybody. I recognize that. This sort of approach probably resonates most with organizations that acknowledge that whatever app they're building truly rests on a foundation of trust, and that if that trust is chipped away in any fashion, it can be an existential crisis for the organization. Those are the organizations I think this message will most resonate with.
Tim Bornholdt 45:53
Man, there's so much we could talk about with privacy, and we could really go down the rabbit hole with it. There's one last topic I wanted to cover in this realm, and it's more of a fun topic for me because it's so fascinating. So we're recording this in February of 2020, and there's been a lot of news back and forth between the FBI and Apple, specifically with regard to end-to-end encryption and keeping users' information private. To give a high level overview, and you can correct me if I get anything wrong here: from what we're seeing in the news reports, the FBI is saying they have a phone with potentially relevant information to a terrorist investigation, they want access to it, and Apple is just not giving it to them. Apple's story is, we cooperate with you all the time. They responded to something like 7,000 or 8,000 subpoena requests and helped where they could, but Apple really is hammering home that they want to be secure and private, and part of that is encryption. What the FBI is asking for is a quote-unquote backdoor into people's phones, and what Apple is telling them is that's just not possible with math and encryption; it's just not a thing that can happen. So there's this kind of epic standoff going on in the news. As somebody who's entrenched in this world of security and privacy like you are, what are your thoughts on this whole issue? I could probably guess where you're at, but I just think it's such an interesting topic, and I think our listeners might be interested to hear what you think about it.
Adam Stone
Yeah, I appreciate that. As you intimated, I do land on the privacy advocate side of the argument. I do for my own reasons, though, primarily because I am concerned about making it more convenient for law enforcement to gain access to communications that were considered private at the time. Whether those private communications were against the law or not is, in my view, a different argument, outside of the convenience issue: it's more convenient if the government has a backdoor into Apple's phone system than having to go through whatever machinations they'd need to go through to try to quote-unquote hack the system to get access. I am quite concerned about overreach, especially in this age where terrorism seems to be the foundation of every argument that law enforcement uses to gain ever more convenient access to communications we believed were private at the time. That's concerning to me primarily because of all the things we've learned over the past several years. The Snowden revelations and the other leaks that have come out since then all compound, one on top of the other, to create a worry that, wow, it really feels as if our government is trying, with little steps that often go unnoticed by the public, to chip away at the protections that app developers have put into their systems to safeguard that data. I will say that I doubt Apple made these decisions out of purely altruistic goals. They have a profit motive to consider: if their buying public fears that Apple will, without much friction, just give away information upon request by the government, then folks are eventually going to stop buying Apple phones. And so there are these two things happening. The privacy advocates are hailing this as Apple standing firm against government overreach, or perceived government overreach.
And you have other folks on the business side saying, yep, this is a good business decision; we don't want to lose our client base because they perceive us to be essentially bowing down to government demands for data. So I have an appreciation for the arguments going back and forth, and as a citizen, I'm concerned about making it convenient for the government to gain access whenever they believe they have a need to investigate this or that activity, whether terrorism or otherwise. It seems to me that if we want a fair and balanced sense of justice in our society, there ought to be some friction, some hoops the government goes through to gain access to data. And in fact, at least in law, we have these hoops; they're called subpoenas and warrants and things of that nature. Unfortunately, what we're seeing is a willingness of some organizations to skip them. I think the latest news we've heard comes from those web apps that do genetic testing, and I won't name the companies, but these folks that have the large genealogical databases also have large databases of genetic material. These organizations seem all too willing to share with the government, not on a case by case basis, but rather in this sweeping way: let's give you a whole batch of data, and you can decide, within that data, what is actionable in terms of an investigation or prosecution. That really hits at the crux of what I think Edward Snowden was trying to expose: this notion of blanket surveillance by the government degrades, or threatens to degrade, our understanding of what a democracy is.
Tim Bornholdt
Well said. I agree with almost everything you said. The systems you were talking about, subpoenas and having to go get a warrant, were put in place for a very specific reason. And we're at this point in society where information moves so fast that law enforcement wants to move just as fast as everybody else can, because we all have that capability. I could pull up my phone and send a message in 12 different ways if I wanted to talk to my wife right now, for example. But if law enforcement wanted to crack all of those, or get access to them, they'd have to go down to the courthouse and file a subpoena, and it's not as fast as maybe they would like. But like you said, as a private citizen, I very much appreciate having my privacy, especially knowing how easy it is to lose it and how much easier technology is making that. So I really appreciate you taking the time to come in today and speak with us, Adam. Where can people learn more about SDS? And how can people get in touch with you if they have more questions about privacy?
Adam Stone 54:22
Well, thanks, Tim. I appreciate it, and I'm really glad that you had me in today to take a little bit of time to talk about an issue I am indeed very passionate about. Folks can find my organization at trustsds.com; that is Secure Digital Solutions' website. Folks can also look me up on LinkedIn. Despite being a privacy person, I use LinkedIn quite aggressively. I'm not a perfect person myself, but I am able to make my own choices in terms of which secrets I choose to give away and which I choose to withhold, and with respect to LinkedIn, I am choosing to disclose. So folks can find my disclosed secrets on LinkedIn under Adam Stone; how do I bill myself, data privacy and security executive. I would love to chat with anybody who is interested in this subject and also interested in having some guidance on how to build a software application with a Privacy by Design framework, the mantra we weave into every stage of the software development lifecycle.
Tim Bornholdt 55:54
I love it, because to your point about using LinkedIn as a privacy expert, you're still using it. It's a spectrum, right? It's a never-ending pendulum swinging back and forth between everybody having access to all of our information and nobody having access to any of it. And as long as people understand what they're trading by giving up their personal information, in a democracy we can choose to do that. I think it's important as app owners to realize it's a spectrum, too, and that we need to be constantly vigilant to make sure we're making the right choices.
Adam Stone 56:33
Absolutely. It really comes down to control. That's what privacy is: the ability to control one's innermost secrets, which are frankly the diamonds of our psyche that we're trying to protect. Sometimes we like to give diamonds away; other times we like to keep them. The ability to control that sort of selective disclosure is what it's all about.
Tim Bornholdt 57:04
Couldn't agree more. Thank you so much for joining us today, Adam.
Adam Stone 57:07
Thank you, Tim. Great to talk with you.
Tim Bornholdt 57:09
A big thanks to Adam Stone for joining me today on the podcast. As he said at the end there, the best place to get in touch with him is LinkedIn, so we'll put a link to his profile in the show notes. Those show notes can be found at constantvariables.co. You can get in touch with us by emailing Hello@constantvariables.co. I'm @TimBornholdt on Twitter, and the show is @cv_podcast. Today's episode was edited by Jordan Daoust. One quick favor to ask of you: if you've got two minutes, please head on over to the Apple Podcasts app and leave us a review. I'm sure you hear that all the time if you listen to a lot of podcasts, but it really does help our show rank higher in those charts. Just head to constantvariables.co/review, and it will launch the app and take you right to where you can leave us that stunning review. This episode was brought to you by The Jed Mahonis Group. If you're looking for a technical team who can help make sense of mobile software development, give us a shout at JMG.mn.