We Live to Build
    49:39 · 2023-08-22

    I Was a Priest. Now I Build Tools for War.

    How does a man of God reconcile his faith with a career building technology for the military? This is that story. I Was a Priest. Now I Build Tools for War. Tom Pierce, a former US Army officer and pastor, shares his incredible journey into the world of defense contracting, where he builds high-tech solutions for the military while grappling with the moral complexities of his dual calling.

    Faith and Business · Military Technology · Moral Leadership

    Guest

    Tom Pierce

    President, I2S Defense Contracting

    Chapters

    00:00 - My Life's Work: Expecting Good From People
    04:06 - From the Army to the Computer Field (A Happy Accident)
    08:23 - From Soldiers to Coders: The Army's First Analysis Team
    12:31 - The "Bully on the Bus": A Pastor's Take on Just War
    25:00 - A True Story of How Broken Software Leads to Loss
    33:03 - The Sacrifice of Intelligence: Why We Value Convenience Over Skill
    37:05 - Why Your Data is an Unfiltered, Polluted Lake
    45:22 - The Most Important Lesson I've Learned

    Full Transcript

    Sean Weisbrot: Tom Pierce is the president of Integrated Information Systems Incorporated. He has become an expert in areas such as integrated business planning, cost and schedule analysis, and cross-functional collaboration. Tom's approach involves combining human intelligence with innovative software, allowing him to successfully bridge the gap between human and computer systems and provide lasting value to his clients. He's a highly experienced problem solver with a diverse background that includes military service, pastoral service, and software service. In this episode, we talk about his experience in the military and how it shaped the way he approaches information science and business in general. Then we move on to how military service created his desire to be a pastor, how what he learned as a soldier helped him as a pastor, and how what he learned from his community as a pastor helped him be a better business leader, because he is a pastor and a business leader at the same time, working with large defense contractors that help the Department of Defense in the United States. What he does is very high level, often very difficult work, and we talk about the little bit of it that he is allowed to talk about. Then we talk about data management and how difficult it is to actually get the data we need and organize it in a way we can deliver to people efficiently. And then we end on the most important thing he's learned in life, and it's so important. I really hope you stick with us to the end and hear everything he has to say, because there's a tremendous amount of value here, and I had a really great conversation. So I hope you enjoy this interview with Tom Pierce. So Tom, tell me about your experience in the military. What was it like? Did you enjoy being told what to do all the time?

    Tom Pierce: Certainly. Well, actually, I ended up not being told what to do all the time. The military training drilled that into me real heavily, and that was about nine months of training. But it was actually on the first day I reported for duty. I was supposed to be an officer in a nuclear ammunition logistics company, and on my very first day of reporting, my new commanding officer informed me that I had been reassigned, like minutes before I even showed up. They had seen my math background and my interest in computers, and, yeah, this is 1981, so it was pretty early, and they were just forming this analysis team. So I got the incredible opportunity, as a brand new second lieutenant, to be included in a very senior combined military and civil service analysis team analyzing ammunition and missile maintenance logistics for the Army. Pretty much my entire four-year tour was spent in that analysis team, learning from military and civil servants much senior and much wiser than me, in the very early days of the Army's use of computers to try to manage logistics, simulation, modeling, all kinds of fascinating things. And yeah, I used this old desktop that had a dust cover on it. It was on my boss's desk and nobody was using it, so I asked if I could just play with it. Basically, I used it to do briefing slides, and then that grew to simulation modeling of the logistics implications, et cetera. So it was quite by accident that I ended up in a computer field while still in uniform. And then when I got out of the military, I pursued it further.

    Sean Weisbrot: I know a number of Israelis, and a lot of them end up specializing in something while they're in the military, and oftentimes afterwards they will end up getting into a field, or creating a business, around the thing they were working on in the military. And they'll have full right to create some sort of IP around what they've done. Do you think there's a similarity here between the US and the Israeli armies, or is it not common to have done what you've done?

    Tom Pierce: Well, I don't know if it's common now. It certainly wasn't common then. You know, computers were brand new, and I'm not even sure I had heard the term intellectual property rights back then, but certainly the software that I wrote while in uniform belonged to the US Army. After I got out of the Army, I spent a number of months in minimum wage jobs before a defense contractor who knew of me from my time in service got a hold of me and wanted to use my experience. Then I started developing software for them, and after about seven years with them, I finally decided I wanted to go out on my own. I still didn't have the IP rights to anything I wrote, because I was modifying code that Boeing owned. So it really took me a long time before I negotiated with my employer and my customers the right to retain the property rights to the software I wrote, and that's quite a battle. It's still a very difficult field. I ended up having to retain the services of a rather talented intellectual property rights attorney to help me negotiate the terms and conditions of my contracts. But yeah, it's still slippery, and in some ways I think it's getting even slipperier as far as what the boundary lines are around what I wrote versus what ideas I borrowed from things I've seen elsewhere or got off the internet. That's a really interesting dilemma.

    Sean Weisbrot: So I would definitely love to talk about that in a little bit. Before we do, I'd like to know a little bit more about your experience working for defense contractors in those early days. So you didn't have any rights over what you were coding, and I think that's common. When I had my tech company, we had people sign away the rights to the code they were developing, because we were paying them for that code, so why should they have the rights over it?

    Tom Pierce: Exactly. And in fact, for me it was third hand: it was the US Army Ballistics Research Lab that was paying my employer to write code for them. So I wrote code for my employer, and the code belonged to the US Army, and we just kind of retained some of the basic ideas and structures. You know, those are yours: the thoughts, the education, the learning, as long as you're not actually cutting and pasting or copying exact proprietary algorithms or anything like that. And I didn't do any of that. But certainly my way of thinking was very much shaped by the simulation modeling and logistics applications that I wrote, both while in uniform and under the Ballistics Research Lab.

    Sean Weisbrot: So why don't you talk about that? That's a good, important angle, I think. How was it shaped? What specifically did you experience that created that opportunity to be shaped?

    Tom Pierce: It grew amazingly quickly, once soldiers in the field started to get their hands on different forms of computers, and we are really talking dark ages here: Commodore 64s, TRS-80s. But you had the new young officers who wanted to use computers to make their jobs easier and the jobs of their troops easier. Literally, the guy at the desk next to mine was still using the old meaning of the term spreadsheet: a very large piece of paper with rows and columns, lined with rulers, and pencils and pencil sharpeners all over the place. He was doing his analysis on paper spreadsheets with sharp pencils and adding machines. And I was helping him do his job more easily by watching what he did, learning what he did, and programming it in a basic way. Then that grew very quickly into logistics modeling that I could use in support of the people managing ammunition supply in the Army. Some of those guys found out about what I was doing. They were traveling to where I was located, at the center and school where they came for their continuing education training, and so I would get to know them, they would see what I had done, and they would get involved. I started going to field exercises demonstrating, and we ended up presenting at the Pentagon several times. It just grew so rapidly when people started saying, I can use this computer to help me do a very difficult, mathematical, tedious job. And then it got into the world of simulation modeling and logistics planning. Very oddly, and it still kind of blows my mind that I just fell into this, but we're talking early eighties, and the people behind the cipher-locked, no-window door right across the hallway from me were working on this ridiculous scenario of how the US Army could provide ammunition to troops in a desert war. A desert war? In 1980? What are you talking about? We were planning for Germany, or something like that. But somebody way back then was trying to model the distances and logistics challenges involved in resupply, particularly of ammunition, all the way from OCONUS to the Middle East. And so I did a lot of the coding for that kind of simulation modeling, and it just grew from there.

    Sean Weisbrot: So, do you think the work you had done in the eighties was used in the first Gulf War?

    Tom Pierce: You know, I wish I knew. Certainly some of the ideas were. We were involved in a very collaborative environment; I would travel all over the country and meet the same people in different cities, because we were all briefing the same concepts to different field commanders. And from the way it played out, many of the concepts I was part of, that I was involved in discussing, certainly seemed to have borne fruit in the way the first Gulf War was conducted. The logistics were really quite amazing the way they played out. Actually, my first Army boss ended up leading a tank battalion in that war. So yeah, the connections are there. I just don't know how strong.

    Sean Weisbrot: So this led you out of the military, let's say, and into continuing to do what you were doing, and you also started to become a pastor. How did you reconcile trying to do something good for a community while doing something potentially deadly against other humans in other countries?

    Tom Pierce: That's a profound question. I'm glad you had the boldness to ask it. My father was career Air Force, my uncle was in the Army, so many of these issues I had kind of wrestled with through childhood and adolescence. One of my ethics classes in seminary was taught by one of the world's most renowned experts on biblical ethics, and the thing that sort of made his reputation was his concept of just war: what makes a war just, and under what circumstances is it good and right to step in and defend someone who is being attacked by an oppressor? A lot of this grew up in the distant shadow of World War I and World War II, but it was also informed by some of the really difficult ethical situations in Korea and Vietnam. I thought the professor did such an outstanding job. As it happened, that semester was when the first Gulf War started, so we were in class, using his textbook on just peacemaking, analyzing the actions and decisions of the US and its allies in whether or not to liberate Kuwait from Saddam Hussein. It made for absolutely fascinating debate and discussion. I've never felt like the answers were simple and clear. You may be familiar with the science fiction writer Orson Scott Card. I heard him give a lecture that had nothing to do with his writing, but everything to do with his political views, and the metaphor he always used was the bully on the bus. If you're sitting there on a school bus minding your own business and some bully in the back starts picking on a young kid, at what point do you stand up and intervene? Is it ethical to sit there and do nothing when some bully is picking on some kid and you know you could do something? It's a wonderful metaphor to explore all of the dimensions of when doing nothing is the right answer and when doing something is the right answer. And I've always felt, and I think I get a lot of this from my dad and my uncles, that to the degree you have trust and confidence in your entire command structure, you don't have to understand the whole picture of what your contribution is contributing to the war effort, if you trust that the cause is good, being well conducted, conducted ethically. I think every generation of soldiers in every country wrestles with that in different ways. But I've always considered myself a passionate moderate. I believe there are excesses on both extremes, and if you'll forgive a lighthearted golf metaphor: I'm a very bad golfer, equally likely to hit the ball out of bounds to the left as to the right. Hitting the center of the fairway is a real challenge for me, and I think that's true of any really difficult ethical dilemma. So in many ways, I felt like my service to the United States Department of Defense was just as much a noble calling as my effort to minister to church members and communities through pastoral care and scripture, trying to use whatever skills, whatever capabilities I have, in causes that do good. It's a never-ending, lifelong battle. And certainly, the movie Oppenheimer just came out. I haven't seen it yet, but I'm not sure anybody has ever faced a greater ethical dilemma than the development of nuclear energy and nuclear weapons. Can they ever be used for good? It's not for me to answer that question for anybody else, but I do believe there is a time for war and a time for peace. I think both are true.

    Sean Weisbrot: Hey, just give me 10 seconds of your time. I really appreciate you listening to the episode so far, and I hope you're loving it. If you are, I would love to ask you to subscribe to the channel, because what we do is a lot of work, and every week we bring you a new guest and a new story. What we do requires so much love so that we can bring you something amazing, and every week we're trying really hard to get better guests who have better stories, and to improve our ability to tell their stories. Your subscription lets the algorithm know that what we're doing is fantastic, and there's no commitment; it's free to do, and if you don't like what we're doing later on, you can always unsubscribe. Either way, we would love a like if you don't feel like subscribing at this time. Thank you very much, and we'll take you back to the show now. I haven't seen the movie yet as of recording this with you. However, on the 27th of July, which is tomorrow, I will be seeing the movie, and a friend of mine wanted me to go see it in IMAX with her, so...

    Tom Pierce: That's what my wife wants to do as well. I'll, I'll be anxious to hear your reactions, you know, afterwards.

    Sean Weisbrot: So, do you think the ministry has helped you to do some good, based on the work that you do? In the last question I asked how you reconcile doing both, but now the question is more: are you doing pastoral work as a way to help you feel better about your life in general, because the work you do probably does lead to some people dying, even if you don't see it?

    Tom Pierce: Yeah, I love the boldness of your questions. My actual reaction is quite the opposite. I have found that what I learned in seminary has been more beneficial to me working within the Defense Department, within the technology arena, dealing with a lot of very human struggles and crises, than it was in the pulpit. I actually found, in my work in churches and ministry, that the congregations I was pastoring tended to be older and wiser than their pastor, and I ended up learning from their wisdom and experience. I started pastoring when I was 35, and many of my church members were already in their seventies and eighties, so I was learning generational truths from them, even as I was the one in the pulpit opening up the scriptures and sharing my insights. It was incredibly collaborative, and that has fed over into a way of interacting with people that is respectful and collaborative. I think too many preachers and too many software engineers take the role of: you sit and listen, I'll tell you what to do, I'm the smart guy in the room, I'm going to write the code, you need to modify your business practices to conform to my computer model and my software architecture, in much the same way that too many preachers say, let me tell you how to live your life. I think both are backwards. I think both roles are better thought of as a service. As a matter of fact, I kind of get a kick out of the fact that, I don't know, was it 10 years ago the phrase software as a service became a thing? It's like, all right, I've been in the military service and the pastoral service, so I don't mind being in the software service. But it's the same theme: what you're doing, you're doing to help people, and you're doing it to help them do whatever it is they felt led and called and inspired and paid to do, whatever motivates them. You're helping individuals do what they do. There is a threshold, a point at which, when you feel that whatever the person you're trying to help wants to do is something you really can't support, you back away. I've done that even in uniform, and I've certainly done that as a defense contractor. There are ethical thresholds of honesty and transparency and integrity of every flavor, where everybody makes a choice. The excuse of "I was just doing my job" really doesn't fly in any of those roles. You're always making decisions about whether or not to continue to support whatever endeavor you're a part of, and it always takes courage to walk away and say, I'm not going to help you do that. But even back to the school bus and the bully: at what point do you stand up, and at what point do you sit down? There's a time to do both.

    Sean Weisbrot: I was thinking about my experience with software developers. In a previous business, with my CTO, I don't think it was like, you need to change your business practice based on this architecture. It was more like, I need you to be clear with me about what you want this thing to do, so that I can make the architecture work the way you need it to, and then, based on that, let me tell you the limitations of what we can do.

    Tom Pierce: Yeah. To me, there is a very strong temptation to think of software development, and technology in general, as something where you have to have the entire game plan figured out before you even start, and then, once the plan is concrete, you just go execute that plan, however aggressively you need to play it to get everybody else on board; you know, get the chief executive to force people to do things the way your architecture was designed. Again, part of military planning is, of course, you go into war with a plan, but you'd better have a contingency plan. And you'd better have multiple contingency plans, and you end up emphasizing preparation more than planning, because you have to be prepared for the thing you don't see coming. That's so applicable in the software industry. I see so many software products fail miserably when things don't go as planned. I've even had people working for me, not for very long, who, when I questioned an error that came out of a piece of software they wrote, would always blame the users. I was like, no, no, the software is supposed to help them, not punish them. I'm not going to name any brand names, but I think everybody is familiar with some of the software products that are very punitive in nature, where if you don't do everything right, the user ends up pounding on the keyboard, ready to throw the computer out the window. Even in day-to-day life, and I want to share one sort of deeply personal episode, but first, more lightheartedly: the cashier at my local pharmacy is wonderful. She knows me and my family; she's checking on everybody. But she can't stand the software that she has to work with to try to get me what I need. And I am totally in sync with her, because I get messages from their company that are wrong, and their technology is horrible. The humans are great, but the humans get into an adversarial relationship with their technology.

    The more poignant piece of that: my mother-in-law passed away about a year ago. She had been dealing with a very long and very rare blood cancer, and there ended up being two different hospital networks involved in her care. Because of a lot of privacy rules and a lot of firewalls between systems, the doctors who were caring for her were not able to collaborate with each other. They weren't allowed to share data with each other. So at the point when she was not able to communicate well, we would have one doctor come see her and tell us one thing, and then another doctor come see her and tell us something different, and they weren't even allowed to collaborate with each other. It got so bad that we knew a test had been conducted that revealed something bad, from the facial expression of one doctor who was not at liberty to share the information with us because he didn't have the permissions. Then we finally found another doctor who said, let me go find out who ordered that test, and then we can get the permissions. They had just installed a new software system, and he wasn't able to navigate it to figure out who ordered the test. As a software developer, that was just heart-rending. That shouldn't happen. Software should not get in the way of people who are trying to do good for people. It should be helping the doctors, not presenting obstacles to them. I think the healthcare industry is kind of parallel to the defense industry in how unbelievably bureaucratic and regulated it is, and in how compliance dictates the behavior of the software, to such a degree that you end up being punitive, being an obstacle, being a barrier, and just frustrating people. I think for most people, going to the hospital, going to the doctor, is a stressful experience, in part because the technology is so poor, and people are so frustrated just trying to do what basic humans would naturally do anyway.

    Sean Weisbrot: Well, I'm sorry to hear about your mother-in-law. It's a very tragic and unnecessary thing that she went through. I can't say that's happened to my family, but I did experience something a few years back where my dad had gone to the hospital because he was struggling to breathe, and they found 15 pounds of fluid in his lungs and his legs; he was an hour away from dying. They were able to save his life, and they had to replace his aorta. But I didn't know anything about hearts, and neither did my mom or my brother; nobody did. My dad was born with a hole in his heart, so we knew he had an issue before, and they patched it when he was a very young kid. He had, like, the second open heart surgery ever to happen in the US. So we knew inevitably something was going to happen with his heart, and it happened about five years ago, when he needed to get his aorta replaced. I found myself needing to be an advocate, and I was in the hospital every day, all day long, fighting the nurses, fighting the doctors. They all seemed like idiots. Inevitably, I realized that he was with the wrong doctor and he needed to get to another hospital immediately. Luckily, my aunt, my uncle's wife, is a cardiac surgeon at another network, so I got her to recommend a cardiac surgeon who knew my dad's situation based on his genetic profile, and we got him moved over, and then he got the care he needed. But it took me like a week of fighting just to make sure they were giving him the right meds. I had to learn how to read his chart. I had to learn how to understand what a normal heart rate is for someone like him, what a normal blood pressure is for someone like him, because I understand it for myself, who has a normal heart, but how does that change when you don't have a normal heart? Nobody around me had any knowledge, so I had to teach myself on the spot in order to keep him alive, literally. And it was shocking to me how ill-equipped the nurses and the doctors were. It was shocking.

    Tom Pierce: Absolutely. And, you know, another parallel, and deep sympathy and great respect for your energy and advocacy. I think too many people don't understand just how much initiative and how much energy it takes to navigate the world well, because the world becomes increasingly complex, and a lot of people just get steamrolled by the complexity because they simply don't know what to do. But one of the other common threads I've found between military service, pastoral service, and software is, if I condense it to a simple mantra: well-informed people make better decisions. You were fighting to become well-informed in the heart of the information age, when information is theoretically everywhere, in superabundant supply, and yet it's water, water everywhere and not a drop to drink. It is still incredibly difficult to get the information you actually need to function in the most critical moments of life and make the most critical decisions. It's not a lack of information; it's a lack of useful information at the right time, at the right place, and that's the logistics part. I kind of grew up trying to get bullets to the right place at the right time. Now I'm trying to get data to the right place at the right time, and the challenges are remarkably similar, not only in the distribution of it, but, you know, what is it, Newton's third law? I think that no transformation of energy is a hundred percent efficient. Well, no transformation of information is a hundred percent efficient either. Something is always lost in translation, and the more onion layers it goes through from source to destination, the less trustworthy it becomes and the more painful it is to extract.

    I think that's the challenge of the new era: with all of this information flowing, and so much of it automated, automated to degrees that you can't even fathom the sourcing. How many hops has this audio and visual signal gone through between me and you? And here we are communicating as if we were face to face. We don't comprehend it, and yet if we had to, it would be unbelievably challenging to really come to a full understanding of everything required for us to even share information with each other. That's the complexity of modern life, where it used to be a whole lot simpler. A funny story, and I'll try to be brief: early in the days of doing the ammunition programming, we had a teammate who was developing state-of-the-art technology to encrypt and decrypt signals received and transmitted over FM radio, and then we added in the wireless modem, the world's first wireless modem as far as I know. Everybody was fascinated; the generals were super impressed. But everybody wanted to know what happens if the technology breaks, and we actually had to put in writing the contingency planning, all the way down to the last resort: a soldier on a motorcycle with a floppy disk. We will get the information from point A to point B any way we need to. And it's kind of remarkable, even today when technology fails, how many people can still pull out a pencil and paper and do the math they need to do. Hopefully a lot, but not as many as there need to be. Because technology fails.

    Sean Weisbrot: I think that's a good point. I've made this point a few times before: it feels like, as things become more complicated because of technology, every successive generation becomes less proficient at using their hands and their minds.

    Tom Pierce: Exactly. Exactly. I believe it was Thoreau who said that for every advance in technology, something is sacrificed, and often that's our intelligence. For everything that becomes easier, we give up something in order to get the benefit. It's a little bit like, you know, I am middle-aged going on senior citizen, overweight, out of shape, play tennis once a week. If I would actually walk where I'm going two miles away instead of drive, I would be in better shape. But I really love the air-conditioned car and the convenience of driving. I give up my physical fitness for the convenience of driving even short distances, and I do that a thousand times a day,

    Sean Weisbrot: which is why I like living in Europe and Asia, because I have to walk everywhere, and therefore I'm more fit.

    Tom Pierce: I spent one semester overseas in France, in college, and was just so amazed at how different the community structures are. This was a fairly large city in France, and almost everybody walked everywhere, because everything you needed was within walking distance, and our cities just aren't designed that way. Some of the newer neighborhoods are, but sometimes we can rediscover old truths: that it makes sense to create communities that encourage walking and bike riding and actual face-to-face interaction rather than miles of interstates and airports.

    Sean Weisbrot: Well, we could say a lot more about that, but I would like to go back to the point you made about how you spend your time now figuring out how to get as much data to the right place as efficiently as possible. So let's talk about some of the difficulties you experience or the world experiences in that regard and, and maybe, uh, how it can relate to the business that you run.

    Tom Pierce: Certainly, certainly. I've more recently landed on the metaphor of water distribution and how it applies to data distribution. You'll hear a lot of people talking about data lakes, and now I'm hearing more and more about pipelines and that kind of thing. If you've ever examined, even briefly, the flow of water from its source to your kitchen faucet, there are a lot of processes involved, and it starts off being non-drinkable. I grew up in the mountains of east Tennessee. We used to drive up to the mountains to get pure water, and then when I returned as an adult I was told, you'd better boil that, or filter that, or somehow treat that before you drink it. Why? I thought this was the cleanest water on earth. Well, all water on Earth is less clean than it once was. So now you're going through all of the layers of purification, and at what point is it drinkable, and at what point does the city put out a boil-water advisory? I think that's what happened with data, with the absolute explosion of the availability of data. You've even heard the people in the profession switch from talking about databases to talking about data lakes, and part of what that means, I'm going to come across a little cynical, but it's intentional, part of what that means is that the task of filtering the water to make it drinkable has been shifted away from the data collector. He just put it in the data lake: drink at your own risk. When you come dip your bucket into the well of the data lake, you need to make sure that it's clean, reliable, trustworthy, and fit for consumption, because the data processing professional decided to stop at getting it to the lake. But now there's a whole new set of data scientists whose job description is more: okay, let me take the data that's in the lake and cleanse it in accordance with the business rules, or validate and verify the sources.
And then you've got the whole debate about what's true and what's misinformation and what's a deepfake and what do you trust. This is not all that different from trying to figure out which version of an ancient text from 3,000 years ago is most reliable. It's not obvious what is true and what is not true when you're pulling it out of a data lake. So now you've got the whole data processing effort, which also involves the human and technological integrity of trying to ensure that the data is fit for consumption. And the more human involvement that is necessary to filter and analyze data to make sure it's fit for consumption, the more you've opened the opportunity for human deception, because all of us humans have parts of us that are less than a hundred percent pure. So the task of distilling data into information, into intelligence, into wisdom is growing much more difficult, involving many more layers of people. And I'll go ahead and share with you that one of the best compliments I ever received was from one of my clients, who told me, "Everybody else gives me data. You give me information." That's exactly what I hope to do. And no, it's not easy, and there are millions of us involved in that effort, and everybody's got their own story. I'll tell you one that just came up today, and I'm going to try very hard not to throw anybody under the bus. My biggest client is using the world's largest enterprise software system, sort of, because when they really want to know when that part is going to be delivered, it's just too much trouble to get all the humans and the organizations and the functions to ensure that the data is entered and processed correctly. It's just way easier to call the vendor and say, okay, off the record, no nonsense: when are you going to ship it? And in practice, they find the answer on the phone call to be more credible, reliable, and trustworthy than any of the data that's entered into the world's largest enterprise software system. I get it. Too many layers. I'd rather go direct to the source. Voice recognition inside my ears: I know I'm talking to somebody I trust. If he tells me it'll be here next Tuesday, it'll be here next Tuesday. I don't think we've figured out how to digitize that yet. I know we're trying, but there's almost no conceivable way you can ever completely replace what it means to build trust in a source of information that is direct. We're going to keep trying, but it's challenging,
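[Editor's note: the "drink at your own risk" pattern Tom describes — raw records landing in a lake unfiltered, with the consumer left to cleanse them against business rules before use — can be sketched in a few lines of Python. The field names and rules below are hypothetical, purely for illustration; they are not from I2S or any client system.]

```python
def cleanse(records, required_fields, validators):
    """Split raw data-lake records into usable rows and rejects.

    records: list of dicts pulled straight from the lake.
    required_fields: fields that must be present and non-empty.
    validators: {rule_name: predicate} business rules, each taking a record.
    """
    clean, rejected = [], []
    for rec in records:
        # Rule 1: every required field must be present and non-empty.
        if any(not rec.get(f) for f in required_fields):
            rejected.append((rec, "missing required field"))
            continue
        # Rule 2: every business-rule predicate must pass.
        failed = [name for name, check in validators.items() if not check(rec)]
        if failed:
            rejected.append((rec, "failed rules: " + ", ".join(failed)))
            continue
        clean.append(rec)
    return clean, rejected

# Hypothetical example: vendor delivery records dipped out of a lake.
raw = [
    {"part": "A-100", "promised_days": 7, "source": "ERP"},
    {"part": "", "promised_days": 3, "source": "ERP"},        # missing part
    {"part": "B-200", "promised_days": -2, "source": "phone"},  # bad lead time
]
rules = {"positive_lead_time": lambda r: r["promised_days"] > 0}
clean, rejected = cleanse(raw, ["part", "source"], rules)
# Only the first record survives; the rest carry a reason for rejection.
```

The design point mirrors Tom's: the filtering burden sits with the consumer, so rejects are kept alongside a reason rather than silently dropped, preserving an audit trail of why the "water" was judged undrinkable.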

    Sean Weisbrot: I can't say I've experienced that at your level, but this past weekend I helped my mom get a car, and I can tell you there was so much misinformation in the process it was almost hilarious. Yes. One part of it was, we went to a dealer, and that dealer said, we have a car you would like, but it's at another dealership, and to be honest, it would cost us more and we wouldn't make enough money to sell you this car; you're better off just driving over to the other one. It's the same company but a different dealership brand, or, no, it's the same dealership brand, but a different branch. One focuses on Honda, one focuses on Acura. We were at the Honda one and we wanted to see the Acura, because they have an internal network showing all of the branches and all of the cars available at those branches, even though they're selling different brands of cars, and for the Acura division to send their car to the Honda division at a different branch, they would lose money on the sale. So when we were looking at the list, it was saying, oh, this is a white exterior and a black interior, right? You have all this information, and you have things like the mileage and all of that. So we saw what we saw, and we drove over there. That was not the truth. The car we ended up buying said there were 20,000 miles on it; it actually had 22,005. There were cars where it was like, oh, it's black on black, and it was actually a black exterior with a brown interior. There was one that it said was blue and gray, but it was actually blue outside, black inside. There were ones where the Carfax said there were no problems whatsoever, but we could see it had clearly been in a front-end collision, even though they said their engineers checked it, there are no problems, and the Carfax is clean. Sorry, there was damage.
I can put my finger in a gap between the hood and the front bumper. Sorry, that's damage right there, because they're supposed to sit very close to each other. So we noticed so many inconsistencies, and we finally were able to get a car we liked, but it took hours, and I literally spent 30 minutes looking at every tiny piece of the car inside and out, including the engine. And I'm not a mechanic, I'm not an engineer, I'm just a guy with a good eye. So you could even talk to the humans there and they won't know the right information, because someone else processed what someone else put in.

    Tom Pierce: Right, right. It's like fragmenting responsibility, and you're incentivizing people to not know. You're giving people incentive to stay in their lane and stay focused only on their task, and the computer or the system or the organization is supposed to take care of everything else. And I think you end up degrading and eroding what for millennia has been an expectation of human-to-human trust building. Once upon a time, in a land far away, you could go to a car dealership and buy a car from a guy you've bought a car from three times before, because you trust him, and you trust that when he says he's inspected this car, he inspected the car, or maybe he personally trained the inspector who inspected the car. And if there's anything wrong with it, you take it back and you know he'll make it good. We've eroded away an awful lot of those layers of human trust validation and verification. It's very difficult to sustain in a highly automated, highly global sense; it's still very local in reality. You're just better off dealing with people you know, when you can,

    Sean Weisbrot: What's the most important thing you think you've learned so far in life?

    Tom Pierce: Oh, very, very good. I think it is really good to approach every human you work with, with the presumption of innocence and the presumption of good faith. I do consider myself a skeptic. I consider myself, most times, a grumpy old man in the balcony, because I've been there, done that, been lied to too many times. But it is still better overall to treat each new encounter, each new relationship, with a tremendous benefit of the doubt and a lot of leeway for mistakes. Actually, one corollary to that I learned from my son, who's now been working for me for 15 years. He's an optimist by nature; he believes nature is conspiring for his happiness, if only he'll pay attention to it. But when things are done poorly or badly, his default response is to never attribute to malevolence what can easily be explained by incompetence. Don't ascribe ill will, don't ascribe evil to somebody who might just be having a bad day, or be overwhelmed, or be untrained, or any number of reasons that they didn't perform well. Don't assume that it's evil and malicious and malevolent, that they're trying to hurt you. I do have a tendency, after so many times, you know, how many times do you overlook an error, three times, or seventy times seven, to get into the New Testament a little bit. My patience wears thin in my old age, and I have to kind of surround myself with younger, brighter-eyed people who are still willing to look optimistically at human relationships and human intentions. And I think there are corrupting influences that you have to watch out for, but if you don't misalign your incentives, I think pretty much everybody goes to work hoping to do a good job. I think everybody wants to be good at what they do at some level,
and they want to do something that is helpful to people, until you impose all the perverse incentives that say, I'm going to get punished if I do that, or I get rewarded if I don't do that; I make more money if I don't send you to the dealer you want. When the incentives are perverse, humans behave badly, and the more the pressure is turned up, the more badly they behave. But I still think the default position needs to be: expect good. Expect good out of people, and expect it of yourself as well.
