
The Main Scoop, Episode 15: Toward a New Humanities in Computing

The advancement of generative AI has left some wondering if we’re moving away from our own humanity. But what if these innovations are bringing us toward a new, improved state of human consciousness in computing? In this episode of The Main Scoop, hosts Daniel Newman and Greg Lotko are joined by Reg Harbeck, CEO and Chief Strategist at Mainframe Analytics, to find out.

It was a great conversation and one you don’t want to miss. Like what you’ve heard? Check out all our past episodes here, and be sure to subscribe so you never miss an episode of The Main Scoop series.


Disclaimer: The Main Scoop Webcast is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors and we do not ask that you treat us as such.

Transcript:

Daniel Newman: We could probably just have generative AI create the script for the show and it would be better.

Greg Lotko: I don’t buy that. You’re the guy who talks to me about having it write our emails and all that kind of stuff, I am-

Daniel Newman: Well, it would probably get to the fact that I’m a bestselling author quickly.

Greg Lotko: I don’t know. So when I get into generative AI and writing emails and all that kind of stuff, we’re both in leadership roles. I really believe you get better leadership when people see who you are, where you’re coming from. And I don’t want to delegate that off to a technology.

Daniel Newman: Once it gets to know me. I mean, anyone that’s followed our show for a while they know that I am the nice one, and you’re Greg Lotko.

Greg Lotko: All right, you might’ve just convinced me. I think what you were trying to say is generative AI would make you a better Dan Newman, and after that last comment I’m willing to give it a shot.

Daniel Newman: Well, the question really becomes does it make us more human or does it make us better humans?

Greg Lotko: I don’t know. But I think we have a guest that can talk to us about that.

Daniel Newman: All right, so let’s get after it.

Greg Lotko: Joining us today is Reg Harbeck. He’s the CEO and Chief Strategist at Mainframe Analytics and has a degree in computer science. But kind of like we were talking, you wanted to bring humanity into your world and your view, your perspective. So you went back to school and you got a master’s in Interdisciplinary Humanities, right?

Reg Harbeck: Yes sir.

Greg Lotko: So what do you think?

Daniel Newman: First of all, what is that? What is that?

Reg Harbeck: It’s a degree that they offer at a local university where I live about English and history and philosophy. And as I looked at moving forward, having spent 35 years as a Mainframe nerd and seeing how to sort of weave the humanity back into that, it looked like an excellent opportunity to really bring that part out and discover the humanity of computing and especially of the Mainframe. And so it was a really neat journey to have that question present in every course that I took as I learned about the history of philosophy, as I learned about all these other things. You know studied the history of science fiction, studied poetry about technology, all these things, trying to see how it all fit with computing and especially the origins of computing and the computer that really so implicitly helps all of humanity by running the world economy.

Greg Lotko: And when we talk about this, think about this, because he got right there when he was describing it: you didn’t start by talking about the Mainframe. You talked about being a mainframer, right? So he brought it right to the human part of that whole ecosystem interaction.

Daniel Newman: Oh, you heard he said English, verbs, nouns, adjectives. He’s doing that thing, and he actually used a pronoun, mainframer, right? Capital M or lowercase?

Reg Harbeck: Well, you know, lowercase actually, and that’s interesting. It’s not quite like using a lowercase “i” to refer to yourself. But yeah, it’s just another reference, just like anybody else, I guess.

Daniel Newman: Yeah. Well, it is really interesting that you heard how we started the show and we’re talking a little bit about, and there’s so many different factors. There’s generations he’s a couple generations older than me. There’s the particulars of our roles, styles of communication, you’re very soft-spoken very-

Greg Lotko: Yeah, that’s me.

Daniel Newman: … very personable leader. But in seriousness, because we both lead teams and the way we lead teams, how could things like AI actually fit into the way we can be more communicative, but at the same time do it at a greater scale, but not lose kind of that persona and personality that inspires people? And so I have to imagine, I’d just love to get your broader take with generative AI and AI in general. This has to be like one of the biggest inflections in history, where humanity and technology are just about to collide in a way that’s going to change the way we exist.

Reg Harbeck: Well and certainly I mean, futurists like Ray Kurzweil want us to think that transhumanism means that our technology is literally going to supersede us. And that’s where I take issue with what he’s concluding about this. I think all of human history has been a journey of creating technology that helped us be more human. And that that’s going to be the differentiator between those technologies that last and those that we try them out and they just don’t work out for us. And so for me the essential impact in the short term at least of AI, is something that’s already been happening using auto reply and other things. And I call it prosthetic consciousness.

And what it is we’re recognizing that our consciousness is one of the essential parts of us as humans, but it’s something we’ve always extended with tools. And so that those tools, including AI that allow us to extend but not displace our consciousness so that we are able to be more effective, but more importantly more human. Those are the things that are really going to last in the long term as we continue this very, very long journey of not merely discovering but defining what it means to be truly human.

Greg Lotko: So I want to pick one word in there because I did also study technical communications in addition to computer science. And you said it twice in there you said technology is going to help us be more human.

Reg Harbeck: Yes.

Greg Lotko: More human or better humans?

Reg Harbeck: Okay, that’s a good point. And I think that I’m using the term more human in a sense to imply better humans. But even that there’s no standard.

Greg Lotko: So one I can embrace, the other I have a knee-jerk reaction, I want to scoff at it. How does something artificial make us more of something we already are? Now, so I would challenge that, but when I hear the better part, and I think about all inventions and the industrial revolution and mechanizing things and all this, it was to make us better at doing something. And it maybe made us more effective, but making us more human with artificial?

Reg Harbeck: Yes, and so I’ll stand by that. And one of the reasons is because the whole journey of humanity in so many ways is discovering and defining and refining and building what it means to be truly human. That if you take a look at all these ways that we have risen above our context and distinguished ourselves, it’s by taking that journey. And so every technology we’ve ever invented going all the way back has not been to displace people. But throughout history, we’ve had Luddites who thought this technology is going to displace people. Look at the whole industrial revolution, and certainly a lot of people at the lower social rungs got treated in a very poor manner as part of that. And yet, through the big picture of it all we found ways to be more and more human. And we have not ceased to be human just because there are any number of automations that do things for us.

Greg Lotko: So what you’re saying is we had a plane of existence, a state of being. We were what we were, but we hadn’t realized everything we could become?

Reg Harbeck: And we never will.

Greg Lotko: … and evolve to. And therefore, because we weren’t at the essence of our state the technologies continued to elevate us towards a truer ultimate state that, and I really do like this part of the theory and the discussion, the idea that evolution will always continue. There will always be a next technology, there will always be another extension or form of assistance that we may not have an awareness of today. I mean we could go back to the sixties and watch Star Trek and many of the things that were on there that were fantasy or science fiction have become reality. So what you’re saying is there’s always going to be that next thing that we haven’t even imagined yet and that will make us more human.

Reg Harbeck: And it won’t always make us more human. I mean there’s no question that some of the technologies that have been used over the last two centuries especially have been horrifically inhumane. And yet we’ve learned from those. We recognized that those did not meet the standard, even though it’s a standard that was not clearly spelled out. We recognized that this was inhumane the way it was used and we’ve moved beyond it. Whereas some technologies have helped us become more human and we’ve built on those.

Daniel Newman: It’s interesting. I don’t know, I just looked out the window and I was thinking to myself about why we don’t have flying cars yet. Solving the geospatial and sort of traffic problems in 2D seems pretty inefficient, but we’re still there. But we’re just putting 800-pound batteries in things now and calling that the solution to everything. I don’t know, so what do you think though?

Reg Harbeck: Well, true. I think that’s one of the big challenges in our history is that we have a sense of where we want to go. We don’t know how to get there but just the vision. And sometimes things like Star Trek, sliding doors, Star Trek invented those automatically sliding doors and then we figured out how to do it and where there’s a will, there’s a way. And so that’s a really big part is when you have the science fiction writers, when you have the idea people who don’t know how to get there, they give us the idea that then we say, is this a way to become more human? And we don’t phrase it that way but we recognize it that way. So we don’t always know when it actually happens whether it is but if it isn’t, we keep moving till it is human.

Greg Lotko: Then are we hitting the mark with all this technology? Are we able to deliver the value to society and to business?

Reg Harbeck: I think it’s far more interesting when we miss the mark. My brother used to wear a button that said, “The difference between genius and stupidity is that genius has its limits.” But my attitude is that in fact it’s our stupidity that’s our greatest strength; we forget that. When an animal makes a mistake it’s not being stupid, it’s just being functional.

Greg Lotko: And we learn the most from our mistakes.

Reg Harbeck: Yes, exactly.

Greg Lotko: As long as it doesn’t kill us.

Reg Harbeck: True.

Greg Lotko: That’s why they came up with the phrase what doesn’t kill you makes you stronger-

Reg Harbeck: But you know-

Greg Lotko: … because the ones that died couldn’t talk about-

Reg Harbeck: But everybody else learns from their example.

Greg Lotko: Absolutely.

Reg Harbeck: Maybe your purpose in life is to be an example. But Isaac Asimov said, “Most great scientific discoveries are not heralded with eureka, but with that’s funny.” And it’s those mistakes that often lead, I think of Post-its and other inventions like that, they’re intended one way, and they turned out-

Daniel Newman: Lots of drugs were discovered by accident.

Reg Harbeck: Yeah, absolutely penicillin. So on the one hand, I think careful scrupulous planning and design is always going to be a part of great inventions. You think about Dr. Fred Brooks and the System/360 and he wrote first of all The Mythical Man-Month and then decades later, The Design of Design about how important design is to do something well. But on the other hand, even to get to the idea of something being possible often requires a lot of mistakes. And so I think the moment we try to put a bar, you know we must meet this level to have achieved something that that itself in some ways is a mistake.

Greg Lotko: So what’s our gut check? What’s our gut check of when technology, a new technology comes around how do we think, is this going to make us more human or are we in the mistake part and we have to get to the next step? How do you identify that?

Reg Harbeck: My three principles have always been beauty, truth, and love, these are the essential things as human beings that we use as our essential measure for whether something really has a long-term value to us as humans. These are the essential measuring rods if you will, for whether we recognize something as genuinely part of what we’re doing. And so when we refer to something as being artificial part of what we’re doing is differentiating it from something that’s real that meets those standards. There’s something about it that isn’t quite good enough, and so we have to keep moving forward.

Daniel Newman: Yeah, it’s interesting. So let’s tie this back to the main part of The Main Scoop, the Mainframe. It’s a product that oftentimes, and I’m the voice of reason here that has to deal with being kind of the butt of the Mainframe jokes. Isn’t that dead, old? And obviously no, it’s not. But my point is, there’s an enduring value to technological innovations that actually stand the test of time. You like to talk about the next thing being the panacea, that’s kind of one thing. And by the way, what’s old is new is a common thing, because I joke about generative AI. I say, you know, CliffsNotes, anybody else use CliffsNotes to pass tests in high school about books they read, I mean? But the point is kind of what’s old is new again, what’s old and trusted and resilient. I mean, planes fly for 30 or 40 years, and by the way they haven’t changed at all. If you look at the actual look of an airplane over the last 30 or 40 years, they barely look different. But the Mainframe has been a staple.

Greg Lotko: The enduring concepts, the enduring technologies or qualities are the things that last, right? So I mean if you think about virtualization, if you think about zero trust with as an approach to security, they’re concepts that were in the Mainframe very early on. Other technologies came along that solved other problems really well, but they did then realize they needed some of these other capabilities, whether it be virtualization or the security or stuff like that. So there is the thread that the things that really do drive improvement of humanity or improvement of an experience, those core qualities and capabilities survive. Some other technologies may become additive or morph around it, but you pull all those things together and yeah, then that’s how we end up always talking about Mainframe, that it’s a component of IT. It’s got some core strengths, but it really is about-

Daniel Newman: The irony of the whole thing. And if you give me like five more minutes, I swear I was going to get to a question, but the irony of the whole thing is that here we are talking about kind of future state. We’re talking about these disruptive types of technology, and Mainframe, probably disruptive isn’t the word anybody would use right now, but it’s been very stable and core to all the disruptive capabilities in the market that we need. The industries in FinTech and healthcare, these industries still are entirely dependent on a Mainframe to do a lot of their core transacting, securely, privately, that people are dependent on. So where’s the humanity in not evolving? Where’s the humanity in staying with what works and not disrupting?

Greg Lotko: You’re implying that Mainframe hasn’t evolved. You’re-

Daniel Newman: No hold on, I wasn’t saying that.

Greg Lotko: No, but you talk about not moving-

Daniel Newman: Iterative versus innovative.

Greg Lotko: I would agree evolutionary versus revolutionary in general, right? But if you look at the speeds and feeds, the transactional throughput, the stability, the measures and strides of the Mainframe, how it has evolved, there’s a reason it’s around, because it has continued to advance at a staggering rate. Now the other reason it’s still around is because of opening up that platform and the recognition that it should be tying into other technologies.

Daniel Newman: It’s like engines, like there’s still combustion engines. They still function much the same, but the amount of efficiency you get out of one has changed greatly. And so aerodynamic, things like that, I get the point where I was really trying to get on this humanity side of things is that staying with what works, there’s got to be some fairly strong-

Greg Lotko: Foundations.

Daniel Newman: … foundations in the fact that people struggle with change and people want stability. And I mean, where does that fit into all this when we’re constantly disrupting ourselves to the point where, how does that help society to provide some stability?

Reg Harbeck: Well, I think the term “it works” is such a core term for our understanding of what works. And one of the beauties of the Mainframe is it manifests the real meaning of the word legacy, as compared to the way it’s being misused by people that are trying to sell something else: it’s established itself, it’s become a legacy of something of value that works. And so as it continues to grow and develop and advance in so many ways, it’s founded on some basic solid principles. And it’s sort of funny that it’s called 360, because in so many ways, just like the wheel, 360 degrees, and they invented the wheel when they invented the Mainframe. And so you’ve got all that solid functionality, and let’s look at society and say, how many things around here do we rely on and don’t even notice, we take for granted?

You know most laws, the way traffic works, there’s over and over again all these different things that have just been baked into how we do life so that we have the ability to move forward in other ways. And so the Mainframe increasingly is baked into that information technology that works so we can take for granted. So we can continue to explore and grow in all kinds of ways, including on the Mainframe. Because we’ve got something that’s foundational, that’s an excellent legacy that works and that we can build on.

Greg Lotko: And again, I like words, right? So you think about the word legacy. I think in today’s day and age people will hear that word and some of them think about the silver spoon, the entitlement, getting something handed to them-

Daniel Newman: Michael Jordan.

Greg Lotko: … But we’ll get back there. But what legacy is supposed to imply and what you’re supposed to get from it is that it’s built on the foundation of the past, that it brings those qualities forward and it continues to evolve and be additive relative to the human condition. That’s what, when we refer to somebody as a legacy, we’re assuming they’ve gotten all the great family values from there. And where are we going with Jordan?

Daniel Newman: I was just saying that legacy can be perceived as a negative or a positive. Legacy of something can mean it’s the best, it is trusted, it works. It was the person you wanted to take the last shot, or the technology that you depend upon to connect the world and to make sure that our systems are up and running and that that traffic system works. This stuff matters.

Greg Lotko: It does.

Daniel Newman: It does. So other than the fact that he said something about the wheel and the Mainframe being invented at the same time, I think he was saying it in-

Reg Harbeck: So 360 degrees.

Greg Lotko: He talked about 360 full scope.

Daniel Newman: I know, but he put it in a sentence where he said the mainframe and the wheel. And I just think we should cut that out as a clip because it was so fun that you know-

Reg Harbeck: It’s a wheel thing man.

Daniel Newman: … even I can use this, the wheel thing?

Reg Harbeck: Yeah.

Daniel Newman: Did you say that?

Greg Lotko: 360 degrees.

Daniel Newman: What a dad joke.

Reg Harbeck: Yep, I do those.

Daniel Newman: I love it. So hey, I want to thank you so much Reg for joining us here today.

Reg Harbeck: It’s a pleasure.

Daniel Newman: It was a lot of fun. It’s always really interesting to go down a little bit of a path that’s not hard tech, but that actually everything in all roads lead back to tech and you know Greg we don’t have to agree on everything.

Greg Lotko: It’s that generational divide.

Daniel Newman: But I’m usually right. I heard something about explaining things and being right in the same sentence, and I’m usually right, but every once in a while I’ll give you one.

Greg Lotko: And even a broken clock is right twice a day so I’ll give you that, Dan.

Daniel Newman: And there we have it. So thank you so much.

Greg Lotko: Join us for the next Main Scoop.

Daniel Newman: Thank you everybody for tuning into this episode of The Main Scoop. We appreciate you joining us. Hit subscribe and join us for all of our episodes. We’ll see y’all really soon.

Greg Lotko: He’s Daniel Newman, I’m Greg Lotko. I got the last word.

Daniel Newman: Always does.

Author Information

Daniel is the CEO of The Futurum Group. Living his life at the intersection of people and technology, Daniel works with the world’s largest technology brands exploring Digital Transformation and how it is influencing the enterprise.

From the leading edge of AI to global technology policy, Daniel makes the connections between business, people and tech that are required for companies to benefit most from their technology investments. Daniel is a top 5 globally ranked industry analyst and his ideas are regularly cited or shared in television appearances by CNBC, Bloomberg, Wall Street Journal and hundreds of other sites around the world.

A 7x best-selling author, most recently of “Human/Machine,” Daniel is also a Forbes and MarketWatch (Dow Jones) contributor.

An MBA and former graduate adjunct faculty member, Daniel is an Austin, Texas transplant after 40 years in Chicago. His speaking takes him around the world each year as he shares his vision of the role technology will play in our future.
