89: The Art and Science of Data with Solomon Anderson, Infrastructure Engineer
Published August 17, 2021 | Run time: 00:55:30
Think globally. Act locally. Infrastructure engineer Solomon Anderson joins the show to chat about how data operates, the art over science of balancing new features with supporting existing infrastructure, and the vast environmental impact of something as minuscule as open browser tabs on our laptops.
In this episode, you will learn:
- How data operates in private institutions vs the public sector
- What computers can do with data that humans can’t
- The price of “free” social networks
- How to balance pushing new features forward alongside the need to support existing tech
- Small infrastructure steps we can take to reduce tech’s environmental costs
This episode is brought to you by The Jed Mahonis Group, where we make sense of mobile app development with our non-technical approach to building custom mobile software solutions. Learn more at https://jmg.mn.
Recorded July 20, 2021 | Edited by Jordan Daoust | Produced by Jenny Karkowski
Show Links
Solomon Anderson on LinkedIn | https://www.linkedin.com/in/solomonranderson/
Solomon Anderson on Twitter | https://twitter.com/1991DBA
JMG Careers Page | https://jmg.mn/careers
Connect with Tim Bornholdt on LinkedIn | https://www.linkedin.com/in/timbornholdt/
Chat with The Jed Mahonis Group about your app dev questions | https://jmg.mn
Episode show notes | https://constantvariables.co
Leave a review on Apple Podcasts | https://constantvariables.co/review
Twin Cities Podcast Hosts UNPLUGGED event details | https://emamo.com/event/twin-cities-startup-week-2021/s/twin-cities-podcast-hosts-unplugged-part-1-okd8rN
Episode Transcript:
Tim Bornholdt 0:00
Welcome to Constant Variables, a podcast where we take a non-technical look at building and growing digital products. I'm Tim Bornholdt. Let's get nerdy.
A few new housekeeping items. First, we're trying something new with ratings and reviews for our show. You know, we know these asks are annoying, but they're very valuable to us. And as a thank you for taking the time to rate and review our show on Apple Podcasts, we will in turn give you and/or your company a shout out on the show. Leave a rating and a review on Apple Podcasts and get free advertising. It's that simple. Even simpler, we've put a link in the show notes to take you right there.
Second, we are hosting our first in person event. If you like podcasts, assuming since you're listening to one you do, and you live around here in the Twin Cities, join us on Tuesday, September 21 at the Machine Shop in Minneapolis during Twin Cities Startup Week. We are bringing together a group of local podcast hosts to learn how they're using podcasting as a sales and marketing tool. I'll be flipping the mic on these podcasters so bring your entrepreneur, marketing, sales, tech, and any other burning off the wall questions you might have and ask them anything. You can get your ticket and learn more about the event, Twin Cities Podcast Hosts Unplugged, at TwinCitiesStartupWeek.com. We will put a link to that in the show notes as well.
Today we are chatting with Solomon Anderson, a professional infrastructure engineer at a Fortune 500 company. Solomon joins the show to talk about the life of an infrastructure engineer, how data is stored and handled, balancing new features with supporting existing infrastructure, sustainability within large organizations and so much more. So without further ado, here is my interview with Solomon Anderson.
Solomon, welcome to the show.
Solomon Anderson 2:08
Hey, Tim, good morning and to the audience, good morning, afternoon or evening, wherever you happen to be hearing this. Thank you so much for inviting me. I'm happy to be here.
Tim Bornholdt 2:16
Absolutely. Really excited to have you on. I'd love it if you gave our audience a little bit of background on yourself and what you do for a living.
Solomon Anderson 2:24
Yeah, definitely. So first and foremost, my name is Solomon Anderson. I'm an infrastructure engineer from the great city of Minneapolis, Minnesota. I like to tell people that I'm from just the Twin Cities in general. I grew up pretty much all around the Twin Cities. So for the listeners out there, if you're from the Twin Cities area, some of the parks I grew up around are Powderhorn Park, Brackett Park, Jordan Park in North Minneapolis, Peavey Park in downtown Minneapolis, and now King Park here in South Minneapolis. So I got to experience all the city's greatness. And yeah, I work for one of the major retailers here in the Twin Cities, doing infrastructure engineering, which is a fancy sounding title that pretty much just means I help keep data secure.
Tim Bornholdt 3:21
Right on, and a very important job at that. I know a lot of our listeners are from the Twin Cities, so I'm gonna put you on the spot: which park was your favorite to be around?
Solomon Anderson 3:32
Oh, well, I'm biased and definitely Powderhorn Park. I consider myself a Powderhorn Park native. I grew up playing for the Eagles there. So I would say that's my home park.
Tim Bornholdt 3:45
I love it. So prior to your current role, I know you worked with Minneapolis Public Schools, and then you shifted to working with MNIT Services with the State of Minnesota. How has your career shift been, going from, like, the public sector into more of the private sector?
Solomon Anderson 4:02
Oh, well, you know, it's a different way of doing business, right. The public sector paradigm, you know, you're working with public funding, taxpayer dollars. So, you know, the money that you and I spend in sales tax, and also just in taxes in general to the State, that more or less comprises the State of Minnesota's budget. So it's a little bit different in the private sector, where things are, you know, oftentimes privately funded or funded by shareholders. And so just a different way of doing business. Still a lot of the same problems that you're dealing with on a day to day, you know, keeping data secure. At least for me, you know, I don't really distinguish between the different types of data in terms of, you know, whether or not it should be secure. I think everyone's data should be secure, right? I think we as individuals have a right to have that expectation that if we are entrusting an organization with our data, that data is going to remain, you know, secure, available for the people who need it. And, you know, not available for people who don't need access to it for whatever reason, whether it's malicious intent, or just not needing access to it for their jobs. I'm a big proponent of data security, and just, you know, data privacy in general.
Tim Bornholdt 5:33
Are there any, like, significant differences between protecting the data of, like, you know, a large retailer, like the one you work for, versus a public, you know, government or state one? Like, I couldn't imagine that there's a whole lot of differences between the two, but are there any, like, fundamental things that you have to keep in mind with one or the other? Or is it pretty similar?
Solomon Anderson 5:56
Oh, I mean, how much time do we have, Tim? There are so many different kinds of data. And I guess to the listeners out there, I'll start from the beginning, right, because I started my career off in the data realm. I kind of always wanted to be in and around data. It's something that I have a, you know, pretty big passion for. Data is more or less, in my eyes, the accumulated experiences of others, right? Let's go back to the first databases, which weren't even on computers, right? The example that I like to use for a lot of people when I'm talking just general databases is the Rolodex. For listeners out there who maybe don't know what a Rolodex is, because, you know, I talk with students quite a bit, and a lot of students have no idea what a Rolodex even is. So, for those of you who don't know, a Rolodex is a device that more or less held note cards. And these note cards would contain things like contact information, so someone's phone number, or their mailing address, you know, because in those days, they didn't necessarily have access to email.
There are so many different varieties of how data is maintained and stored, like we have the Rolodex example of, you know, contacts that you wanted to keep in touch with in the past. These days, there are so many different varieties. So with the State, I was working on MNsure, healthcare.gov. And a lot of that data was protected health information, or PHI. PHI is a subset. It's a specific type of data that deals with healthcare. And the reason why we call it PHI is because that type of data is subject to different rules than, say, PCI data, which is payment card information. These are a lot of acronyms. So for the audience out there, sorry if I'm losing you.
Tim Bornholdt 8:12
Oh, no worries.
Solomon Anderson 8:14
More or less, you know, there are a variety of different kinds of data that organizations might come across or maintain or be responsible for. And oftentimes, that data is subject to different rules. You know, a lot of people, you might have heard of HIPAA, right. HIPAA is legislation that primarily focuses on PHI, so protected health information, but it also covers PII, which is personally identifiable information. And so there are a lot of different types of data and different rules and caveats that that data might be subject to. So there are a lot of different nuances in, you know, how you perform certain business tasks when you're, you know, working with health data versus retail data. There are also different configurations that you have to keep that data in versus a retail environment. So there are a lot of different nuances, but still a lot of the same problems of, you know, making sure people have the access they need, making sure that people who don't need access don't ever get access. But how you get there can look very different depending on which avenue you're coming from.
Tim Bornholdt 9:39
So let's say I'm building an app, and I have an idea for what the app should do. But I don't think in terms of data, right? So someone comes to you with an idea and they say, I want this built. How do I start going about the data? How do you think about what sorts of information needs to be stored and how you go about protecting it?
Solomon Anderson 10:02
Oh, that's a great question. I think it really depends on what you're trying to do. Especially in the context of mobile apps, there are a lot of different things to think about. So, let's say, how is your app going to make money, first and foremost, right? A lot of apps, or a lot of the apps on the app stores, are free. And so one of the primary ways that developers can make money is using ads, serving ads to the users. And so how would this potential app developer plan to use MAIDs? Or what are called mobile advertising IDs, which are essentially unique identifiers for every mobile device out there. And so, you know, Tim, let's say this is your app that we're building. And the first thing I'm going to ask is, What are you trying to accomplish with your app? What are users going to be giving or putting into your app? And what is the expected outcome for them? Is it, you know, a social media app where you're providing your, you know, biographical information, in hopes of connecting with others who maybe have the same interests or, you know, maybe went to the same high school? Or is this an app like, you know, Duolingo, where you're trying to enable people's learning, and you're collecting things like account information, preferred or, you know, spoken languages, or languages they want to learn? There are just so many, so many different, you know, ways to kind of think about the data that you're not just asking for from your users, but also the data that you're maintaining on your end and how you're giving back data to the people who might be using this application, whether it's through, you know, advertisement, whether it's through collecting log information.
So you know, if you've ever had a Microsoft Office program become unresponsive, and you have to, you know, control-alt-delete to end that process, sometimes what will happen is a bug report will pop up, and it'll ask you, Hey, do you want to send this information to the developer or the vendor who, you know, is responsible for this software, to help them with their troubleshooting? That's also a form of data that the user, the end user, is sending back to the provider of that application. And so how are you going to make that connection happen? And how are you going to store that data? Those are, I guess, the first questions that pop up to me, but also, you know, what is your plan for keeping this data secure? Where are you going to store this data? Is it going to be on a personal server in, you know, your basement that has maybe less-than-scrupulous security? Or are you going to put it in the cloud? And if you're going to put it in a cloud bucket, whether that's on Amazon AWS infrastructure or some other kind of technology, are you going to ensure that it's configured in a safe and secure way? The primary focus for me is mostly always gonna boil down to, you know, the data that you're collecting from consumers, how you're keeping that data secure, and what your plan is for, you know, maintaining and keeping that data secure, because that's an ever evolving target. It's a, you know, a constant problem that not just mobile app developers, but large organizations, like my own, are constantly dealing with.
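To make the cloud bucket configuration point concrete for readers following along, here is a minimal sketch, assuming an AWS S3 bucket and the boto3 library, of checking that a storage bucket blocks public access before any consumer data lands in it. The bucket name is hypothetical and purely illustrative.

```python
# A minimal sketch, assuming AWS S3 and the boto3 library, of checking whether
# a cloud bucket is configured to block all public access. The bucket name is
# hypothetical and used purely for illustration.
import boto3
from botocore.exceptions import ClientError

def bucket_blocks_public_access(bucket_name: str) -> bool:
    """Return True only if every S3 public-access block flag is enabled."""
    s3 = boto3.client("s3")
    try:
        response = s3.get_public_access_block(Bucket=bucket_name)
    except ClientError:
        # No public-access block configured at all; treat as not secure.
        return False
    config = response["PublicAccessBlockConfiguration"]
    return all(config.get(flag, False) for flag in (
        "BlockPublicAcls",
        "IgnorePublicAcls",
        "BlockPublicPolicy",
        "RestrictPublicBuckets",
    ))

if __name__ == "__main__":
    print(bucket_blocks_public_access("example-app-user-data"))
```

A check like this could run in a deployment pipeline so a misconfigured bucket is flagged before real user data is ever stored there.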
Tim Bornholdt 14:02
Yeah, and it's one of those questions, too, that I get asked a lot. People will ask some of those questions of how you keep data secure and how you go about it. And it's asked at a really high level, right, it's not asked at a specific level. So you kind of have to just generally, you know, ask those kind of leading questions to get to specific answers. But I think that what you've been saying makes a lot of sense of, you know, before we can really answer, like, you know, how data should be protected, we need to really get an understanding of a lot of the business concerns. It's not just about the app itself. It's about what you're going to do with that data and how it's going to move from place to place. And every single time your data moves from one place to another, you know, for example, if you have something in your app that can export data, you know, that's a concern. Because someone can just dump all their data out and email it to somebody else. And now we can't guarantee the security of that data, right. So there are a lot of, like, different facets to keeping data protected. And it's really interesting thinking about, depending on your problem, like if you're in the medical space, you have to worry about HIPAA and PHI and PII. If you're collecting credit cards, you have to worry about PCI compliance. There are so many acronyms, like you said, and different buzzwords that when you build an app, it really helps to have someone like you that has gone through this wringer a few times, and kind of has an understanding of the possibilities that you're going to, you know, run into, and then also ways for dealing with those things as they pop up.
Solomon Anderson 15:40
Yeah, having someone like me can help. But, you know, I'm definitely not a wizard by any means. It really does take a team of people who not just understand the data, but also the security implications of, you know, that data. We don't have to, you know, think too far back in the headlines to, you know, come up with relevant examples. Colonial Pipeline is a huge, huge, you know, story in security right now. But also, I think back to, you know, the Experian, or was it Equifax, I'm sorry. I can't recall if it's Experian or Equifax.
Tim Bornholdt 16:19
It was one of the two.
Solomon Anderson 16:20
One of those two, one of those two, you know, consumer credit reporting agencies. They had a large, you know, data breach as well. And we're still dealing with the effects of data breaches like that. And by the way, to the audience out there, if you haven't already frozen your credit reports at all three bureaus, please do that immediately. Because, one, it's free. Two, it'll help keep your credit secure. But it's also just a good practice. And, you know, as new technology develops and is released to the general public, I think we're going to continue to see these types of data breaches. And it's not just going to affect, you know, us as individuals. It will impact, you know, potentially the supply chain. We saw that with the Colonial Pipeline hack, where there were large swaths of the southeastern United States that just did not have access to gas. And it wasn't because there was a gas shortage, you know, so to speak. It was just because the supply chain was not able to deliver the gas where it needed to go. And that's one small, well not one small, one example of, you know, the scale of how large this problem can be, and how, you know, it can affect us in our day to day world. Because, you know, for a lot of these security breaches, it is something that we as consumers hear about, but it's not something that's real to us, because we don't experience the side effects and the byproducts of, you know, what this hack may be, unless you're one of those consumers who was personally impacted by, you know, maybe your payment card information getting leaked, and someone being able to, you know, charge your credit card. Without those kinds of, I guess, personal experiences where you're saying, Oh, wait, you know, this hack does affect us and this is how, it's hard to really understand not just the breadth of, you know, the security problem, but also how quickly it can turn into something that affects your day to day life.
Tim Bornholdt 18:51
It's always fascinating to me, that problem of, like, immediacy and relevancy. I mean, you can even look at, like, the pandemic we're going through, where so many people, you know, continue to this day to say that COVID isn't real, or that people don't get that sick or anything like that. And it's just, you don't have to go very far into Reddit to find examples of people who, you know, back last March, were complaining about the shutdowns and saying, It's just another version of the flu and it's not that big a deal. And then fast forward 18 months and they're posting, I'm in the hospital with oxygen being pumped into me, please pray, and stuff. And it's just like, people really tend to underestimate the impact of things until it actually impacts them.
And the thing with technology, the way it relates back to this specific topic, is people really undervalue the importance of data and how a company like Facebook, you know, why is it valued at several billion dollars? It's like, there's a very specific reason for it, and it's like, you may think that you posting your update about getting a sandwich at a restaurant or something isn't in and of itself particularly valuable. And you may be right about that. But when you take, you know, millions of people posting millions of things all at once, and you can start to run patterns and run different queries against that data and sell the results back to different people, it's like, your individual contribution to the data may not be significant. But when you take all that data en masse, it's a lot. It can lead to a lot of big problems. So if you're thinking about building software, and you're thinking about operating at scale, you know, that's why you need to be thinking about these problems, because it's bigger than just one individual. People can take this data and even, you know, find ways to match pieces of data that you wouldn't think are relevant, like different websites you go to, or kind of following you around the internet. It's like, these companies and anybody that has that kind of scale can piece together information about people that they didn't even know about themselves. So that's why we hammer home these points about data and security so much, and especially for people listening to this that are thinking about building mobile software, you have to start to think at a bigger scale and think, you know, what can computers do that humans can't do?
Solomon Anderson 21:19
Yeah, yeah. And, you know, it does not take a whole lot of data to get some pretty interesting insights about someone, you know. Going back to the mobile device landscape, I mentioned earlier that, you know, mobile devices have these things that are called MAIDs, mobile, I forget what these stand for, I think it's mobile advertising ID. I think that's the acronym, forgive me if I'm mistaken. And so these are more or less unique identifiers for each and every cell phone out there. So if you happen to have a smartphone that has any type of ad whatsoever, your phone has a MAID, a mobile advertising ID. And while the data that each individual app may collect can be, you know, anonymized or pseudo anonymized, it doesn't take a whole lot of searching to find, you know, ways to combine that data.
And so I'll use an example from, I forget the name of the app. But there was a prayer app that a number of people downloaded and had installed on their devices, and it was a free app. So it served ads to its user base. And, somehow, some way, the list of devices that were being served ads through this app, that data set got leaked. And by combining that, you know, supposedly anonymous data set with some of the other data sets that exist out there on the web, it became a trivial matter to identify not just the people who were using the app, but also the places that they were going, the property that they may have had in their name. And it's almost shocking to kind of discover how quickly and how much data a company like Facebook might have on you. And there have been a number of examples of, you know, people finding out things about themselves that they might not have known had it not been for the type of ads that they were receiving. I've lost track of how many people, how many headlines I've seen of people discovering things like a pregnancy from, you know, getting served ads about baby formula, or someone, you know, finding out some, you know, crazy information about themselves or a family member, because of the ads that they were seeing. And it's because, you know, some way, somehow, some advertiser out there was able to gather enough data about a consumer to predict, Hey, this person is likely going to, you know, be pregnant within the next year or so, let's serve them ads. Or this person is likely to travel in the next two to three months, so let's serve them, you know, advertisements about travel. There are so many different data points out there that it only takes a few, you know, connections between different data sets to figure out some really, truly interesting and almost frightening things about the average everyday user.
Tim Bornholdt 25:04
Yeah, I could go on and on about this, because it's something that I think we as technologists really intimately understand. It seems like maybe the world and the general public at large is kind of coming around to this idea, but I still preach it as much as I can, because there's always somebody who doesn't know. And I'd rather people be aware, when you're posting on Facebook, it's free. No one's paying for Facebook in a way that they understand, you know, how they're paying for it. That's the part that appalls me the most. I really don't think people understand, like, when you use a service like Facebook, or Twitter, or LinkedIn, or any of the free social networks, there is a price you're paying in the long run. And I think at least you should be aware of what that price is, and have consumers have the understanding of what they're doing before they, you know, just post because it's free.
Solomon Anderson 25:56
Yeah, and not only are you paying a price, but oftentimes you are a product in those, you know, paradigms, whether that's in the form of your data being sold, you know, en masse, of course, with, you know, thousands of other people's, or, you know, in different contexts, it's not just being a product, but also delivering value on behalf of, you know, these organizations. Not that it's a good or a bad thing, but I'm with you. I think people should be aware that it's happening. And, you know, as more and more people become aware, they can make informed decisions, and maybe they say, Hey, you know, I'm okay with that exchange of information. I'm more than happy to let a Facebook know about, you know, what my day to day activities are in exchange for having quick and easy access to communicate with family and friends. You know, I think, for me, it's up to the individual consumer to decide what their level of comfort is. But I agree with you that we both as consumers, and also, I think, on the organization side, should do a better job of making people aware of what's really at stake. And it is not, you know, always just what your day to day activities are. It's also your location, it's also, you know, property you may own, the car you drive. All of these things are relatively small and minute in scale, but if you, you know, take a step back and kind of understand the picture, or all of the data and how many implications and, I guess, impressions that you can more or less predict from a consumer, I think the more people understand that side of things, the more... I'll say it's just a very nuanced conversation. So it's tough, I think, for consumers to know, because these aren't things that we talk about in schools for the most part, you know. While some students may understand what data is, they might not necessarily be thinking about how the data that they're providing to an app, or the information that they're sharing over, you know, an Instagram chat, could, you know, down the line, come back and show up in the form of them getting, you know, a specific ad or, you know, having their data included in a data set. There are so many different, you know, dynamics at play that it's very difficult. It's a very nuanced conversation.
Tim Bornholdt 29:05
Yeah, it's nothing we'll solve here today. I think it's more just you and me making people aware, you know. There's a conversation that needs to be had that isn't really being had. And it kind of comes down to, you know, critical thinking. They might not teach the basics in school about how data can be mined and aggregated and pieced together in certain ways, but, you know, they can certainly teach critical thinking, and all you have to do is think, like, Oh, this service is really valuable. I wonder why it's free.
Solomon Anderson 29:41
Yeah, but it's, you know, it's one of those things. I think it's challenging not just for, you know, the young folks out there, but also, you know, we've seen it in Congress as well. There's been a lot of debate around consumer protection. You have GDPR, which is, you know, for people who aren't in the tech industry, legislation that deals with consumer privacy and, you know, tracking. And so for those of you out there who have, you know, an iOS device, if you've been keeping up on updates, you may have seen over the past year that you're getting notifications that are asking you, Hey, do you want to allow this app to track you? And that came as a result of that GDPR legislation and just consumer protection laws in general. And so, slowly, those conversations are starting to come into, you know, the larger conversations that are happening in the general public. But it wasn't until, you know, that update happened that people were asking, Well, wait, what's tracking? What is tracking me? You know, how are they tracking me? What information are they tracking? Are they just, you know, tracking the websites that I'm a part of, or are they tracking, you know, where and how I'm navigating through the web? And even just that distinction is, you know, a lot different from the cookie conversations that we were having about websites in the early 2000s. Because that's a whole different ballgame. And the methods that are being used to, I guess, extract that information from consumers are different, but also the ways in which it's being extracted are different too, you know. Back in the day, it may have been in the form of a consumer survey. Now, a lot of those consumer surveys are more or less social games that people play on, you know, you see the quizzes on Facebook, if you're a Facebook user. Or maybe even the polls on a, you know, site like LinkedIn. That data is all boiling up to something, and you're oftentimes getting those surveys because you've been identified as a member of a population that either that advertiser or the survey provider wants to, you know, learn from and speak to.
Tim Bornholdt 32:24
Oh, man, I'm moving the subject. I'm changing the subject because I'm starting to get mad about all this stuff. And this show is supposed to be happy. And so we're moving it along. Yes, to button it up, there's a lot of shenanigans on social media. We'll say that.
So we were talking about cybersecurity, right, and talking about building out infrastructure. If you're thinking about building out an app, you might have different questions on how that data should be secured. And you had mentioned that one of the things that you ask when you're going through that is how you are going to maintain it and keep it secure. So I know, as someone that's been building apps for other people for nine years, the thing that people want to talk about is adding new features. And the thing that nobody wants to talk about is maintaining what you've already got. And I think you can find a lot of examples across a lot of industries for why this is the case. I mean, people love the shiny new thing. But as somebody that does infrastructure engineering, you probably have a very specific viewpoint on deciding between, you know, maintaining some legacy infrastructure versus pushing along new features that need to be added on top of it. So how do you think through the balancing act of adding new features to your app versus maintaining what you've already got?
Solomon Anderson 33:45
You know, it's a fine balance, it's a balancing act, right. And it's more of an art than it is a science in my book, just because it can be very difficult to, I guess, remove those legacy dependencies. And I will go back to some of my earlier experience, and I won't call out any names of specific organizations. But in my career, I've had to support a number of different, you know, types of systems and architectures. So I've worked with mainframe computers. I've worked with software like FoxPro, if folks out there remember what FoxPro was and is. And it can be extremely difficult to find where that balance is. Because for a lot of these legacy applications, the reason why they are around for so long is because it's tough to find replacements. There's a reason why the banking industry still uses a lot of mainframe computers. It's not because there aren't better computers out there. No, there are much more efficient computers. But those systems have stayed in place because it can be extremely difficult to mimic the way some of those older, especially analog, systems work. For those folks who may be in the finance industry, I'm sure you're well aware of all the complications that come up from something as simple as processing, right? Processing order of expressions, right? So like, how do you calculate the amount of interest on a specific account where compound interest is, you know, a factor? And for you and I, you know, we don't have to deal with those problems on a day to day basis, but just having one, you know, decimal point off on one of those very large scale banking computations can make the difference between hundreds of thousands and millions of dollars. And personally, I don't want to have, you know, I guess, that worry in the back of my mind of, Oh, wait, was that zero supposed to be in this space or that space? Was it supposed to be a floating-point number? Or was it supposed to be a real number? And, you know, just having to deal with problems on the level of thinking about the data types that you're using. Because, you know, numbers aren't always treated the same between different programming languages. And how do you solve for that? There are all kinds of just special cases and caveats that, you know, if you're building a regular web app, you probably will never have to worry about. But if you're working with consumer financial data, you have to breathe and bleed these, you know, very complicated, very complex programming paradigms. Because if you don't, it can mean, you know, the difference of millions of dollars, and also potentially your job. And so, for me, I think, one, you should always have a contingency plan in place. Right. So what is the absolute worst case if, you know, a system that you depend on happens to break down? How do you go about fixing that? How do you recover from, you know, a catastrophic failure? Start there, step one. Step two, you know, is a new feature going to be the difference maker? And by being the difference maker, I mean, is it going to have a, you know, irreversible effect on the user base? Is it going to, you know, make things a whole lot better? Or is it going to make things a whole lot more secure? What is the value of this new feature that you're bringing on versus, you know, maintaining the old infrastructure and ensuring that that infrastructure is secure? Because for a lot of these different hacks and, you know, data leaks, think of, like, the Heartbleed SSL bug that came out.
It's not all just old infrastructure. While a lot of these security incidents happen because of old infrastructure, sometimes it's also happening because you're introducing features too quickly. You know, going back to the Heartbleed SSL issue, you know, that issue existed only because people weren't up to date on patching. If people had been up to date on patching, it wouldn't have happened. And so, you know, a lot of people tend to blame old infrastructure for security lapses, but it can also be new infrastructure, too. So, you know, is this new feature going to be the difference maker? Is it worth the potential risk of having a security event? Yes? Or if the answer is no, then, you know, oftentimes the answer on implementing that new feature is going to be no as well. But you also can't, you know, box yourself in and put off those updates for too long, because then you're at risk of having some of those older, you know, bugs and, you know, maybe it's a zero day attack, whatever the case may be. The longer your infrastructure is, you know, I guess, available and public facing, the higher the likelihood of it being compromised. That's just kind of the name of the game. And so, from there, you have to, you know, determine how often you can take outages, how, you know, frequently you want to do your updating, and how you can best, I guess, posture yourself from a security standpoint to mitigate and reduce the risk of those attacks happening to your organization. And, you know, like I said, it's more of an art than science, because the answer is different for every organization, for every team, and for a lot of products, even within those teams.
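As an illustration of the data type and rounding concern Solomon raises about interest calculations a moment earlier, here is a minimal sketch in Python comparing binary floating point with the decimal module, rounding to the cent each period the way a ledger would. The principal, rate, and term are invented purely for illustration.

```python
# A minimal sketch of the rounding concern described above: binary floating
# point versus decimal arithmetic for compound interest. Figures are invented.
from decimal import Decimal, ROUND_HALF_UP

principal = Decimal("1000000.00")  # $1,000,000.00 starting balance
annual_rate = Decimal("0.05")      # 5% nominal annual rate
periods = 12 * 30                  # monthly compounding for 30 years

# Float version: carries full binary precision and never rounds to the cent.
float_balance = float(principal)
for _ in range(periods):
    float_balance *= 1 + float(annual_rate) / 12

# Decimal version: rounds the interest to the cent every period, as a ledger would.
cent = Decimal("0.01")
monthly_rate = annual_rate / 12
decimal_balance = principal
for _ in range(periods):
    interest = (decimal_balance * monthly_rate).quantize(cent, rounding=ROUND_HALF_UP)
    decimal_balance += interest

# The two totals disagree by a small amount; at bank scale, across millions of
# accounts, the choice of numeric type and rounding policy is real money.
print(f"float:   {float_balance:.2f}")
print(f"decimal: {decimal_balance}")
```

The point is not that one number is "right" but that the data type and rounding policy are part of the business rule, which is exactly why replacing a legacy financial system is so delicate.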
Tim Bornholdt 40:28
And to me, it just seems like, like you said, it's art over science. It's just kind of an unanswerable dilemma that is a never ending constant when you're dealing with software. If you keep maintaining your software, and you keep having users using it, eventually you're going to hit a point where someone's going to be using your app on a 10 year old device that has all kinds of vulnerabilities. And it's like, how much effort do you want to put into having that one person be able to use your app, versus everyone that's upgraded to an operating system 10 versions newer? It's a never ending struggle where you and your team have to, like, take a calculated risk together. Because there's no such thing as, like, an impenetrable system, you know. There's always going to be a way in if there's somebody that's motivated to get into that system. So you kind of have to play it, I think, a little bit of, you have to take kind of calculated gambles, you know, of, do we spend the time to upgrade to this thing? Or do we spend the time to add in a new feature? And like you said, I think it comes down to listening to your users and kind of thinking through all of the business concerns in order to, you know, really come up with what the right answer is, and make it more of that art over a prescribed science.
Solomon Anderson 41:50
Yeah, yeah. Like you said, there's no such thing as perfect defense, right. And so, you have to find that sweet spot of, you know, who your users are, what your use case is, and, you know, go from there. But even if there was, you know, a special magic system that you could just buy off the shelf, and, you know, have it instantly secure your environment, your environment is only going to be as secure as the people who are supporting it. You know, you can have the best security in the world from a, you know, a coding standpoint or infrastructure standpoint. But if someone within your office, you know, might potentially take a USB drive and plug it into a computer, well, there goes your security posture right there. Right. And it can be something as small as that, you know. I think of examples like the Stuxnet worm, I think that's right. But if you look at, you know, that example, that example came from literally a thumb drive, and you're talking about some of the most secure, one of the most secure facilities, you know, at least in that region, at that time, being compromised by, you know, what, a $15 USB device. Having those threats as a constant, you know, potential attack vector is something that you're always going to have to deal with. You have to, like you said, decide within your team, you know, what's your priority? How much time do you want to spend on supporting the old infrastructure versus, you know, building out those new features, that new functionality, and getting new users or, you know, introducing new features for the user base that you already have? Because it's, you know, a different answer for everybody, depending on your team, depending on what your goals are, so on and so forth.
Tim Bornholdt 43:54
Last question for you here. I want to change gears one more time. And I know that this is something you've been thinking about a lot, but we at JMG, we've been really thinking about our digital carbon footprint. And, you know, thinking through, we've been redoing our website, trying to lower the emission costs that we've been able to calculate. And I mean, I know that tech's impact on the environment is something you've also been exploring. Is that something that you've been trying to include in your role where you're at right now? And where are you in the journey of like, kind of trying to think through what impact all of this technology and the internet at large has on our world?
Solomon Anderson 44:37
Yeah, I mean, it's a large question. And, first and foremost, I'm just sad it's coming up towards the end of the episode, because I wish I could spend a lot more time on it. But, you know, I'm a big proponent of thinking globally and acting locally. So, like, you know, I think you don't have to look very far to see the effects of climate change, right? We had, just over the past month, flooding in places like Germany on a scale that we've never seen before. In Florida, we have buildings literally collapsing. And it's in part, you know, due to the impact of climate change, and it's something that I'm constantly thinking about, not just in, you know, my personal life, in the form of, you know, compost and recycling, but also in my day to day job. And I guess I get a unique perspective compared to most people, because a lot of people, you know, I won't say a lot of people, but most people aren't spending their day to day in a data center, right. A lot of people probably haven't even seen a data center or been inside of one. Because, you know, a lot of them are out there. You might have even passed a data center and not even known it, because you might have thought it was just a regular warehouse, until you see, you know, 15 AC units on the top of the building. But, you know, it's something that's constantly in the back of my head, because the systems I support are on such a large scale. And we're talking, you know, not gigabytes of data, but terabytes or petabytes of data, in some cases, and all that data has to go somewhere, right. It has to be stored somewhere. And not just stored in a way that it's secure, but also stored in a way that is accessible to people who need access to it. And that costs, not just money, but also resources, right? There are, you know, thousands and thousands of dollars that go towards just the electricity and energy costs associated with running one of these data centers. And so that's something that I'm constantly thinking about, but also on a personal level, you know, the amount of time that we have a tab open in Chrome. If you're, you know, one of those people who happens to have a lot of Chrome tabs open at any given time, and I'm definitely one of those people, I'm trying to be better at it. But, you know, I might have 50 tabs open on a given day. And while I'm not actively, you know, working in those tabs, just having that tab up is costing money, right, or costing resources, because in the background, you might be refreshing, whether it's just the page, or maybe the ads on that page. And that, you know, on a small individual level, it's, again, you know, like a lot of this conversation, might be minute, might be minuscule. But if you scale that up to, you know, hundreds of thousands or millions of people, then you, you know, can very quickly get into some just large amounts of data and electricity costs, and it's a lot to consider. And so I'm always thinking about ways to be more efficient, whether it's, you know, sunsetting systems that aren't being used by many people, maybe migrating, you know, a lot of teams, and you've probably experienced this, Tim. If you're building an app or a new application, a team might get their own database, and they might really be protective of that database. And they might want their own database themselves. But they might not need, you know, their own database.
Maybe the data that is for their app can live in a database with another application that, you know, doesn't have a large user base, and maybe they can share infrastructure, instead of having two separate instances that are, you know, taking the same amount of resources, relatively speaking. And so there are a lot of different ways, from an infrastructure engineering standpoint, that we can, you know, just take small steps to reduce that footprint, you know, trying to encourage behavior that isn't wasteful, whether that's in the form of, you know, reminding people that you'll be logged out automatically after, you know, five minutes of inactivity. Just small, small steps that, you know, again, might not seem like much on an individual level, but if you build them up, can have, you know, huge cost savings, reduce the footprint that, you know, your organization or your application might be using. And just, you know, not being as wasteful as we have been in the past. I think if we do just a little bit to reduce that waste, we could see a huge impact from it. And if we keep making, you know, small changes like that, then we kind of get in that mode of thinking, Alright, how do I be as least wasteful as possible? And once everyone is kind of on that same page, I think we'll be in a good place.
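For readers curious what the shared-infrastructure idea Solomon describes can look like in practice, here is a minimal sketch, assuming a PostgreSQL server and the psycopg2 library, of giving two small applications their own schemas and scoped grants inside one database instance instead of running two separate servers. The connection string, schema names, and role names are hypothetical, and the roles are assumed to already exist.

```python
# A minimal sketch of consolidating two small applications onto one shared
# PostgreSQL instance, isolated by schema, instead of running two separate
# database servers. Connection details, schema names, and roles are hypothetical.
import psycopg2

conn = psycopg2.connect("dbname=shared_cluster user=platform_admin host=db.internal")
conn.autocommit = True

apps = (("loyalty_app", "loyalty_rw"), ("store_locator", "locator_rw"))

with conn.cursor() as cur:
    for schema, role in apps:
        # One schema per application keeps each team's data separated...
        cur.execute(f"CREATE SCHEMA IF NOT EXISTS {schema}")
        # ...while a scoped grant limits each role to its own schema, and both
        # apps share the same hardware, patching, and backups.
        cur.execute(f"GRANT USAGE, CREATE ON SCHEMA {schema} TO {role}")

conn.close()
```

The trade-off is the usual one Solomon alludes to: less duplicated hardware and energy, in exchange for more coordination between the teams sharing the instance.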
Tim Bornholdt 50:13
It kind of harkens back to our earlier conversation about people not understanding their data and what the value is to that and what happens when you have a mass amount of data, what you can do with it. It's similar to this sustainability stuff within technology. If you're building an app, and you are going to have scale, it's like, every little piece of code that you run increases the amount of energy that you're drawing to run it. And so the more compact, the better. It's kind of like best practices, again, just going back to accessibility. If you think about the things that help make people's lives a little easier, that have disabilities or other kinds of issues with working with technology, you make changes so that it helps them out. And then in turn, it helps out everybody. It's kind of a similar principle to sustainability within technology, where if you have an app, and it's just your app, then, you know, you can monitor that and be safe with it. But if you've got 10,000 people all using your app at the same time, and you're wasting resources, you know, negligently, all that little bit of data adds up and adds up. And that's why, if you were to take the internet as a whole and compare it to the different countries in the world in terms of energy use, the internet itself would be, like, the third or fourth largest polluter and consumer of energy. So the point I want people to take away from this conversation, again, I wish we could go on for longer, but the point I want people to take away is, if you're going to be working on an app or working on a website or something, don't just think about the energy usage of an individual using it. Think about how many people are going to be accessing this information and find ways that you can shrink it down so that you're being as energy efficient and optimizing your resources as you possibly can.
Solomon Anderson 52:00
Yeah, whether that's in terms of, you know, the databases you're using, the code that you're using, like you mentioned, you know, keeping that as clean and light as possible. Or even the way that you're doing advertising, right. Like, all of these things can boil up into that idea of, you know, being resource conscious, and not necessarily thinking with a scarcity mindset, but just realizing that there is scarcity, and how best to, you know, utilize the resources that you're going to be using anyway. How can you make the most of that kilowatt of energy? Is it going to be, you know, running a large code set? Or is it going to be, you know, putting energy into making sure your code is as light and scalable as possible? So that, you know, in the event you do have a million users, you're not, you know, using up a million kilowatts of energy, because, you know, that's super unsustainable, and hopefully, you're not doing that.
So, you know, thank you so much again, Tim, appreciate you inviting me on to Constant Variables. I hope the audience out there has, you know, a great rest of your day, evening, afternoon, wherever you happen to hear this, and feel free to reach out. You know, I'm on LinkedIn, Solomon Anderson. I'm on Twitter, 1991DBA. So, my, you know, Twitter users out there, feel free to send me a tweet or a DM. Happy to be a resource to anyone out there who has questions about infrastructure engineering, data, shoes. You know, I'm a shoe collector, so I'm happy to talk sneakers with you too, and with everybody out there who, you know, might be a collector. And yeah, that's about it.
Tim Bornholdt 53:55
It's great when you're the host and the guest kind of wraps themselves up. That's phenomenal. I don't think I've experienced that. Solomon, this was awesome. And I wish we could have gone longer. I had to call it early. But I do really appreciate you coming on the show. I hope people go and find you on LinkedIn and Twitter and pick your brain a little bit, and I hope that you and I can connect in person in the future.
Solomon Anderson 54:15
Yeah, I hope so. You know, happy to go, get out, grab some drinks, maybe grab some lunch, but whatever the case may be. Happy to connect. So whenever you have some time, let me know.
Tim Bornholdt 54:26
I love it. Thanks, Solomon.
Solomon Anderson 54:27
Yeah, thank you, Tim. Have a good one.
Tim Bornholdt 54:31
Thanks to Solomon Anderson for joining me on the podcast today. You can connect with Solomon on LinkedIn at the link that we will place in our show notes. Speaking of those show notes, you can find show notes for this episode at constantvariables.co. You can get in touch with us by emailing Hello@constantvariables.co. I'm @TimBornholdt on Twitter and the show is @CV_podcast. Today's episode was produced by Jenny Karkowski and edited by the happy go lucky Jordan Daoust.
As I mentioned at the top of the show, if you could take two minutes to leave us a rating and review on Apple Podcasts, we'll return the favor with a mention in a future episode. Just visit constantvariables.co/review, and we'll link you right there. This episode was brought to you by The Jed Mahonis Group. If you're looking for a technical team who can help make sense of mobile software development, give us a shout at JMG.mn.