What is User Privacy, and Does It Actually Exist?–Futurum Tech Podcast Episode 043

On this edition of the Futurum Tech Podcast: Tech's privacy problem, and whether Facebook is moving us toward a solution or into murkier waters; Apple and Qualcomm post quarterly earnings following the settlement of their global litigation; even e-cigarettes get into the data collection business (now what?); the US-China trade talks and why the US loosened its focus on IP protections; plus reports from Dell Technologies World and the SAS Global Forum. Those stories and more coming up on this episode of FTP.

Our Main Dive

When it comes to privacy, the tech industry has pretty much done whatever it wishes. But in the wake of repeated data hacks, ethical missteps, and a growing call for privacy standards, many firms, like Facebook, claim to be on the bandwagon. But is this real? Or is it a way to fend off regulation? And by the way, just what IS privacy, and is it the same as trust?

Our Fast Five

We dig into the week’s interesting and noteworthy news:

Tech Bites

Did the US just drop privacy and data theft concerns to strike a trade deal with China? We sure hope not.

Crystal Ball: Future-um Predictions and Guesses

Will Facebook be successful in transitioning to a “privacy” business model that keeps regulators at bay?


Fred McClimans: Welcome to this week’s edition of FTP, the Futurum Tech Podcast. I’m Fred McClimans, your host for this edition and joining me as always, my colleagues at Futurum Research, Dan Newman and Olivier Blanchard.

Gentlemen, welcome, how are we today?

Daniel Newman: Happy Friday, I’m excited to be here with you three, oh wait us three. It’s three of us.

Olivier Blanchard: Amazing.

Fred McClimans: Us three, yes, we're back to three again. For our regular listeners, as you know, as analysts in the tech industry we travel a good bit, and on occasion it's just two of us. But this week we are back to full strength with all three, and we have a very interesting show today. We're going to tackle the issue of privacy, what it is and what it isn't, and why some of the announcements that we're hearing out of Facebook may not be as private as Facebook would like you to believe. We're also going to run through our Fast Five, we're going to highlight our Tech Bites winner of the week, and of course we'll be doing our Crystal Ball prediction. But before we begin, as always, we need to remind our listeners that the Futurum Tech Podcast is for entertainment and informational purposes only. Nothing that we're going to say here today should be construed in any way as advice for what you should do in the stock market, even though we will talk about earnings and companies in the market today.

So, with that, gentlemen, our main dive today, privacy. That’s a topic that we’ve talked about a good bit here on the podcast and with good reason. It’s important. It’s important not just to the consumers, the users of social media sites, of apps, of software services or ride-sharing services out there but it’s also important from a larger perspective.

We as a global society have to decide how much privacy we actually expect and how much of it we're willing to give up to providers of information services. Services like Facebook and YouTube, anything where you're not paying for it, where the user is actually the one being monetized: they aren't the customer, they are the product.

So this week, building on their recent history of embracing the word privacy, Facebook came out at the F8 conference and announced a new, revitalized focus on privacy, including new and reworked apps: a new WhatsApp, a new Messenger, and essentially a new Facebook, which is rolling out now on mobile devices and will continue rolling out later this year. This also includes an emphasis on moving people from the timeline, the newsfeed aspect of Facebook, into what Zuckerberg is calling their secure and private apps: the groups, and one-on-one communications using Messenger to communicate securely and privately. They've talked a lot about how they're going to use encryption and how they're going to make that really tight for the user. And as I mentioned with groups, they've talked about taking people off the newsfeed, out of that sort of broadcast aspect of Facebook, and into individual groups that are tailored to their interests. Again, those groups are supposedly much more secure. They keep using the word private, they point to encryption.

But at the end of the day, I listen to everything that Facebook is saying, and it just doesn’t come across as being private. It doesn’t come across as being secure. In all of these applications here, Facebook is still going to monetize user data. And nothing that I’ve seen out of Facebook even comes close to addressing I think the biggest aspect of privacy, or lack of privacy with Facebook and other sites, and those are the user data breaches. That’s a big issue. When we see Facebook giving information to a Cambridge Analytica, or when we see Facebook accidentally posting user data onto AWS, or when we see somebody hacking into the system and extracting data from Facebook or any number of sites out there, that’s the real breach. That’s where the user trust is really lost. So, you know I look at these announcements and I’ve got a question, you know, what do we really mean by privacy? Because on the provider side, whether it’s Twitter or Facebook, YouTube, Google, take your pick, they are creating a different definition of privacy.

They’re talking about it from a you can have a private one-on-one conversation. They’re not talking about it in terms of actually securing and locking down user data.

And I think that’s a big issue here and one that’s going to land these guys into hot water with increased regulation. Regulation that Facebook seems to be embracing. In fact there was one message or one news article out there that I read this week that actually talked at length about Facebook actually be willing to accept a government approved board member. Or at least a government improved- I think the way they put it was the privacy or security officer in there. And while that’s an interesting step, at the same time I kind of look at that and go, that sounds like a political approved- I’m trying to think of the right word for that. It just sounds wrong, having somebody that’s politically approved inside Facebook trying to manage what’s private or not. It sounds a bit like censorship there.

Olivier, what are your thoughts on this? Do you see this split in privacy, the definition, and a widening gap between what the providers may mean and what users may be expecting?

Olivier Blanchard: Yeah, so, I think that Mark Zuckerberg's definition of privacy is not the same definition of privacy that you and I and other Facebook users might have, or might even need. And I find it very unfortunate that the focus of F8, and obviously Facebook's focus, as Mark Zuckerberg outlined at F8 this week, not talking in terms of we as the company but I as Mark Zuckerberg, CEO, wasn't led into by a discussion about trust. He talked about a privacy-built company, not a trust-built company. And I don't think that you can realistically, or even operationally, have privacy without trust. Not in the sense that we as users understand privacy. And so, what I see in what Mark Zuckerberg is trying to build for Facebook when he talks about privacy isn't data security, it isn't really a discussion about how to minimize fraud on the platform, it isn't a discussion about how to minimize abuse and hate speech on the platform, or even really about the data security abuses that happen through Facebook, and sometimes by Facebook.

What he outlined this week is essentially closing conversations and hiding them behind walls. He was kind of talking about turning Facebook a little bit more into Reddit-style groups, right, where you're going to build walls around conversations, communities and groups so that they can have private conversations. But the problem with that is no one's asking for this. No one's asking for more private conversations; that's fine for Messenger and WhatsApp. For the overall Facebook experience, what you need is actually more transparency, not less transparency. You want instances of fraud, hate speech, data issues, doxing to be out in the open so that they can be visible. You can't identify them, you can't spot them, you can't measure them, you can't address them, you don't even know that they're happening if all of the conversations and interactions are hidden behind privacy walls.

And so, by introducing this new vision of Facebook, what Mark Zuckerberg appears to be talking about is a new model in which advertisers and other users could target users individually, specifically, with messaging, with ads, with whatever, behind these walled gardens, out of sight of regulators, out of sight of law enforcement, out of sight of whistleblowers, out of sight of activists. And so it feels to me like he's moving in the opposite direction, and it could be an effort to shield the company from the type of oversight that it is currently being threatened with. So from a PR angle, he appears to be all for regulation, calling for more regulation, more oversight and more transparency, but in practice he appears to be moving the company toward opacity, toward walled gardens, toward private communications that cannot be monitored and measured for those types of abuses. So, I'm getting mixed signals here. It doesn't really make a whole lot of sense.

Fred McClimans: Yeah, you know it’s interesting. When we talk about the shift there from the timeline into the private groups or the private messaging, as you mentioned we still have that issue of inappropriate content being placed there and the only way I see them, you know really addressing that behind that façade there is for Facebook to actually monitor those groups a lot closer. So, you may have what you think is a private group and in Facebook parlance that really means that the communications are encrypted. But again, the only way they can effectively police, you know hate speech and inappropriate content there is to peer behind that façade and more closely monitor what’s taking place in the groups. So, I just don’t see that as a win-win.

And certainly, if you're looking at Facebook continuing to monetize through advertising, again, there has to be some aspect of Facebook that's monitoring and watching what you're doing and what you're saying there. Now, Dan, we've talked a lot about the concept of digital trust in the marketplace, that aspect of trust that a digital brand creates and conveys as it builds a relationship with its consumers: that the service is reliable, it's what they expect, it's secure, it doesn't lose data, it doesn't share inappropriate data. In conversations we've had in the past, you brought up the idea of a trust continuum, the idea of trust being linked to sort of the brand acceptance, or adoption, out there. Do you see a play here with that? Does this trust continuum model get impacted negatively for Facebook by this recent move?

Daniel Newman: I don’t know if it could be more negative than it already is, Fred, I think that continuum’s very real and we’re seeing regulators and governing bodies starting to put more pressure on companies. I agree with a lot of what you and Olivier have said about what’s actually going on behind the scenes. I will layer one more thing. Olivier, you talked a lot about concerns about creating walled gardens and sort of smaller groups that lack oversight but I’m not entirely sure that’s any different than what’s already been happening. I think you’ve always been able to be in groups, private groups, eliminate certain people from seeing.

I think the real question is what Facebook is touting as privacy versus what's being executed as privacy. Who are they keeping it from? Who are we being private from? You have privacy from whom? And that's really a big question right now, right? So, is Facebook keeping the encrypted data from others, is it encrypted from Facebook itself, do they get to see the data, are they able to, you know, peer into individual groups and chats, do they have a way to moderate and identify, using algorithms and machine learning, to potentially spot a terror cell that's using a hidden Facebook group as a way to communicate? Because you would hope they can. But even for more benign uses, right, advertising as of today. What are the on-ramps and off-ramps? I talk a lot about that as part of the continuum.

The continuum isn’t about making everything private, the continuum is about transparency and flexibility. So, I like in the experience too, I’m a huge soccer fan or football fan if you’re listening in Europe or anywhere else in the world. In the future, with augmented reality and real time and 5G, you might be able to walk into a stadium, opt-in to a series of augmented or mixed reality experiences, or live highlight plays that will be sent instantly to your device so you can see up close and personal the play if you’re sitting up in the nose bleeds. Well, when you go into the stadium, you’re going to have to opt-in to that experience. And the way the entire privacy spectrum works right now is you opt-in and you basically never get out. And so once you’re in the system, and it’s for instance right, you say, “Well when I delete my Facebook account today, am I really gone?” Is all that data really no longer in existence? Because as far as I know, I think it still is in existence, I still think all that data is out there and I have no reason to believe otherwise. There’s not proof that when you delete yourself all that data really goes with you.

Same thing now: when you opt out of stuff, are you really being forgotten? Do companies really have policies to completely eliminate all the data they have collected on you? And what about the data they've sold to others, that was their data and is now other people's data, that you had no idea you were opting in to? So the systems, the guardrails, the boundaries: there's just nothing in place right now to actually slow down the proliferation of data abuse. You opt out of one thing, you're into another. This is a really huge case study of, A, what are these companies actually saying versus doing, and B, Fred, you want that first real application for blockchain? It's going to be digital custodianship. We have to be able to take back control of our data and our usage, opt-in, opt-out, on-ramp, off-ramp, in a very, very quick and flexible manner. And this is a huge undertaking, but I don't see another technology that's going to be more capable of doing it, and until we have that kind of custodianship, where we actually own all of our own data and then volunteer it into experiences, we're going to continue to be the victims of data abuse and data theft, and I don't see any way around it.
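Dan doesn't spell out how such a custodianship system would be built, but the core idea he's describing, a tamper-evident, append-only record of every opt-in and opt-out, can be sketched in a few lines. This is purely illustrative (all names are hypothetical, and there's no actual distributed ledger here, just the hash-chaining that makes one auditable):

```python
import hashlib
import json


class ConsentLedger:
    """Toy sketch of 'digital custodianship': an append-only,
    hash-chained log of opt-in / opt-out events, so a user could
    prove exactly which uses of their data are authorized."""

    def __init__(self):
        self.entries = []  # each entry links to the hash of the previous one

    def _append(self, user, purpose, action):
        prev_hash = self.entries[-1]["hash"] if self.entries else "0" * 64
        body = {"user": user, "purpose": purpose,
                "action": action, "prev": prev_hash}
        body["hash"] = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        self.entries.append(body)

    def opt_in(self, user, purpose):
        self._append(user, purpose, "grant")

    def opt_out(self, user, purpose):
        self._append(user, purpose, "revoke")

    def is_authorized(self, user, purpose):
        # the most recent grant/revoke for this (user, purpose) wins
        for entry in reversed(self.entries):
            if entry["user"] == user and entry["purpose"] == purpose:
                return entry["action"] == "grant"
        return False  # never opted in

    def verify_chain(self):
        # recompute every hash to detect after-the-fact tampering
        prev = "0" * 64
        for entry in self.entries:
            body = {k: entry[k] for k in ("user", "purpose", "action", "prev")}
            expected = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()).hexdigest()
            if entry["prev"] != prev or entry["hash"] != expected:
                return False
            prev = entry["hash"]
        return True
```

The point of the chained hashes is the "transparency" half of Dan's continuum: the provider can't quietly rewrite history, because altering any past entry breaks every hash that follows it.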

The coordination to get that done, by the way, is going to be of seismic proportions. And I have no idea how it will actually happen, but what I can tell you is that everything we're hearing right now is lip service. Olivier, you wrote the piece on the Futurum Insights blog, I recommend everyone check it out: total lip service. Everything we're hearing from most companies about their concerns about privacy is lip service. The question is, when do we get more transparency, when do we get more control, when do we get on-ramps and off-ramps, and how do we really get off the ramp when we're done experiencing any brand that we choose? As of right now that type of exit doesn't exist, and until it does, privacy is a myth.

Fred McClimans: Yeah, you know, it certainly doesn't exist here in the US. I think the GDPR regulations in the EU are a step in the right direction, and there is a mechanism in place for individuals to at least go to a provider like Google and say, "Look, delete me from the search results. Delete the data you have there." Just this week, Google announced that in some regions they're starting to give users the ability to automatically delete their Google history, just selected areas of the history, but automatically delete it, like on an every-three-months basis. It purges it.

But that still doesn’t address that aggregate data that you mentioned, Dan, that’s been sold, that has been accessed by others, you know pulled out of the Google system. While I think that’s a good step, I think the bigger challenge here is that we don’t have guardrails because we never built roads in place. We never defined what it is that is acceptable or not acceptable, and now we’ve got a significant portion of our global economy that is based on this wild west kind of approach. And in order for us to actually reign that back in and put some structure in place that is both beneficial to society and still provides for the infrastructure, the providers, the economic engine that we have in place today to keep operating.

I think that’s really a challenge there because the way I see it, in an ideal, perfect world, the mechanisms that we have in place would actually be very detrimental to the business models of a YouTube, a Twitter, a Facebook and so many others. I think that’s a significant issue that we need to address because the situation we have today is simply not workable at this point, everyday there’s another breach, everyday there’s another example of something that has gone awry in the digital information economy that we need to correct. Because right now it looks like we’re almost ready to drive off a cliff in some areas.

So with that gentlemen, let’s go ahead.

Daniel Newman: That’s just something I want everybody to think about as they walk away from this conversation and the entire conversation, Google says you deleted it, where did it go? Is it really gone? And I’m not just talking about who other third parties, I’m saying, is it sitting in cold storage somewhere, in some data warehouse, in a data lake, that just no longer TINs the algorithm that they’re using for search? Or is it gone? Do they remember document destruction, you’d store your files information, the old information services.

Olivier Blanchard: That’s the thing, it’s not because even the right to be forgotten, right, which we didn’t really talk about, which is kind of this right to remove yourself from Google searches, and notably there was a case where this guy had a criminal conviction that he wanted removed from Google searches, and he sued Google for the right to be forgotten, which is fine. Those types of discussions, fine, Google can artificially, or can mechanically do something to keep you out of those searches. It doesn’t mean the data’s not sitting out there somewhere on hundreds, if not thousands, of servers on whatever.

So there’s still a difference between what’s visible and searchable on the Web, or inside an app like Facebook, and data that’s actually been stored, collected, filed away somewhere. So the data is not destroyed, it’s not gone. It is somewhere, it’s just burning bridges between your terminal and the data doesn’t make the data disappear. And unfortunately we don’t have a way to make that disappear.

But we should have a discussion at some point, in terms of fixing this, about an opt-in model where the user owns their own data and should be able to do with it what they want. With a company like Facebook, you could have a model whereby Facebook, now that it's not growing anymore, we're all addicted to it, it's already scaled, there's really no reason for it to be completely free. You could have a transactional model by which users are charged, say, nine dollars, $9.99 a month, right, for their normal account use, and that's with no data collection. But if you're willing to trade some of that data for a lower price, you could work your way down: every time you opt in, every time you check a box saying yes, I'll allow Facebook or whatever platform to collect this data on me, and this type of data, and this type of data, and use it, sell it to third-party providers, that price goes down, and it can go down to zero.

And to a certain extent, you could even have Facebook pay you for access to some of your premium data, stuff that’s kind of above and beyond the normal marketing, demographic, behavioral stuff. And that might be acceptable and so maybe we should- The issue isn’t so much to stop companies from collecting our data, it’s to give us control of our data again by making these things opt-in and by giving us something back for that value, whether it’s actual cash, or virtual cash, or bitcoin, or some kind of semi or completely free access to features in applications. And that seems like a business friendly and a consumer friendly solution or compromise to all of this.
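The sliding-scale model Olivier describes is easy to make concrete. As a toy illustration (the base fee and per-category rebates are made-up numbers, not anything Facebook has proposed):

```python
def monthly_fee(base_fee, opted_in, rebates):
    """Start from the full, no-data-collection price and subtract a
    rebate for each data category the user has opted in to sharing.
    A negative result means the platform pays the user."""
    return round(base_fee - sum(rebates[category] for category in opted_in), 2)


# hypothetical per-month rebate for each data category a user can share
REBATES = {
    "basic_demographics": 2.00,
    "ad_targeting": 3.00,
    "location_history": 3.00,
    "third_party_resale": 4.00,  # the 'premium' data Olivier mentions
}

print(monthly_fee(9.99, [], REBATES))                # 9.99: full price, no collection
print(monthly_fee(9.99, ["ad_targeting"], REBATES))  # 6.99: partial opt-in
print(monthly_fee(9.99, list(REBATES), REBATES))     # -2.01: platform pays the user
```

The last case is the scenario where opting in to everything pushes the price past zero, i.e. the platform compensating the user for premium data.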

Fred McClimans: Yeah, you know Olivier, there are those that would argue that when you accept the Terms and Conditions blindly, without reading, which is what 99.9999 percent of people do today, you are in essence opting-in to the service, and you’re opting-in to them being allowed to manipulate your data. And I don’t think that most people realize so many of the apps that you have running on your cellphone, they’re location based. As soon as you connect that in, you’re providing a wealth of information about what you’re doing and even getting to that point where we strip that back I think is going to be a challenge.

Now in the case of Facebook, the argument against Facebook going to a flat-fee or flexible-fee model as you outlined, or having a clean service, is that once Facebook does that, they have defined the boundary of how much they can monetize an individual user. And when they do that, all of a sudden those massive projections for what they can do, the new ways they can slice and dice the data and advertise into the service, go away, and all of a sudden you have a very predictable business model, whereas now Facebook still has a lot of potential upside in the advertising space.

The other thing that I would caution is, we saw this with sites like Hulu, and Spotify. Hey, look, pay your premium price, register for the service, and great. The first six months it was ad free and then they said, “Well, hey we’re going to put back a few advertisements in here.” And the only way they can make those advertisements relevant is to access and store and track more of your personal data.

I like that model, what you’re describing, I’m just not sure it’s practical in today’s marketplace.

Olivier Blanchard: Just a quick rebuttal before we move, because we need to move on, but I just want to outline the fact that the value assigned to a customer’s data, between the customer- so that transaction between the customer and Facebook about the value of that data, doesn’t have to be the same or doesn’t have to be equal to the value assigned to that data by the people buying the data from Facebook. And so you can have price X, right, like say nine dollars a month is the assigned value, or the value that I assigned to my data when I sell it to Facebook, or when I agree to let Facebook collect it. That data might be worth 30 dollars to Nike when it purchases it from Facebook.

So the two don’t have to be symmetrical, so I don’t think that’s really a concern that we need to have. I don’t think it’s as finite of a value as your first point, or your first argument claimed. So I would encourage people to think a little bit more broadly about. And on the other thing, I think removing the ads from the experience is certainly something that can be included in a premium account, or some kind of additional fee, no fee service but removing ads from the experience is not the same thing as collecting data.

I think that opting-out of ads is not the same thing as opting-in to data collection. And I think those things- they’re parallel but they’re not the same.

Fred McClimans: One thing I will say as we wrap up this section is that, in the end, I agree with you completely: the only way this system works, the only way users can get control and transparency over their data, is to give them the ability to actually own and track where their data is and where it has been distributed. And the only approach I see coming close to being able to do that is the one you mentioned, blockchain technology. I do think that is a potentially phenomenal way for blockchain to really have significant impact in the market.

But we are a good bit away from that today.

So with that, let’s move into our Fast Five. Five items that we saw this week that kind of made us go, “Hmm, this is worth noting.”

Dan, let’s kick it off with you. You spent the week at Dell Tech World. How was that?

Daniel Newman: Well, it was a really solid event. I spent a couple of days there and had the opportunity to moderate their 5G panel, talking about 5G, specifically infrastructure for service providers, which is a big topic. In the 4G era, Dell's name never would've come up. But for the sake of the Fast Five, let's talk fast about a couple of the key announcements.

One, Dell announced a partnership with Microsoft Azure. This was really big. Satya came on stage; this is really a VMware and Azure partnership, basically allowing VMware to be run fully in an Azure environment. So this was a big step.

The next thing they announced was the Dell Technologies Cloud, another very big step. A little bit of a convoluted message; I'm looking forward to hearing them break it down, because there were seven parts that made up the Dell Tech cloud, everything from flexible consumption, which Fred, you and I spent a lot of time talking about, down to workload provisioning and multi-cloud migration. The big topic for the whole event, though, was multi-cloud.

A couple of other interesting announcements: one, they had a whole new series of storage and data protection solutions. So, you mentioned data protection and privacy; well, at the infrastructure level, Dell has a big interest in this. We heard Michael Dell talk about it a lot on stage all week. And then the last thing that caught my eye was the Dell Technologies Unified Workspace. This was interesting in terms of workplace transformation because it's really a full-lifecycle solution that helps automate, provision and deploy devices for users. It's tracking analytics and behavior, so there goes our data again, but it's being utilized to help update the right software, get the right laptops and equipment into the hands of employees, secure their devices, deal with the BIOS on their devices.

So there's a lot going on, but it's designed so that large enterprises, which have lots of provisioning to do, can get the provisioning work out of the hands of IT and back into the control of the worker, so that no matter where you are, no matter what you're doing, the work is easy to do.

Fred McClimans: Very good, very good. Olivier, you've been tracking some earnings this week. I believe Qualcomm and Apple?

Olivier Blanchard: Yeah, so I was kind of interested in seeing how Apple and Qualcomm were going to be received this week, so close after the announcement of their global settlement, where they agreed to drop all lawsuits globally. So it was kind of interesting; not a lot of surprises there. Apple announced slightly softer results than people expected. Their overall sales were down five percent from the same period last year. And if I'm not mistaken, iPhone revenue by itself was down almost 20 percent, it was like 17 or 17.3 percent, year over year. And that's important because iPhone revenue accounts for a little over 50 percent of Apple's revenue for the quarter.

So with that said, the stock did really well; investors reacted very positively to these mediocre, or not mediocre but definitely lackluster, results for Apple. In contrast, Qualcomm exceeded expectations, but guidance was a little bit guarded, and so the stock actually went down a little bit. It was interesting to see Apple not doing super well but the stock shoot up, and Qualcomm do much better than expected and see the stock drop a little bit. And companies-

Fred McClimans: Investors love massive buybacks.

Olivier Blanchard: Yeah, I know, and you know what, Apple is- I have to hand it to them, they do some things very well, and managing the Street is one of the things they do better than pretty much anything else. One thing I did notice, though, is that they basically increased dividends five percent, which exactly matches the loss in revenue for the quarter, so it seemed like a nice, symmetric answer to the problem. The issue with Qualcomm, though, I think, is that as tech analysts we tend to look at things on product-lifecycle timelines, so we look at a minimum of 18 months ahead to know where a company's really going, and we really think in terms of one year, two years, five years, 10 years, as opposed to Wall Street analysts who might be looking at quarter-to-quarter or year-over-year timelines.

What I attribute Apple's strength to is just the continuity, and the fact that Apple is going to get good again, iPhones are going to get good again. Just the thought that iPhones are going to have Qualcomm chips in them again is good news. Those probably won't come until 2020, but still, it was enough to carry the stock forward. With Qualcomm, what we see is that 5G, which is a huge play for Qualcomm, probably won't scale and won't really start monetizing itself for another few years; we're still very early in that game. So Qualcomm is going to be a huge, huge stock, I think, in my opinion, personal opinion, not advice, in the next few years, but it's still a little bit early for that to happen in the next 12 months.

The revenue from the new licensing agreement that it has with Apple, where Qualcomm chips are going to go into iPhones, really won't practically go into effect until the chips are sold to Apple, and Apple will most likely not start building, or manufacturing rather, a 5G Qualcomm-powered iPhone until late 2020. So we're not going to see a boost from that Apple partnership in the next six months. There are other factors as well, but I think it's important for people to realize that both stocks, both companies, are going to see major revenue increases, I think, starting in H2 of 2020, just not yet.

If you were a little bit scared by Qualcomm, or even disappointed by Apple’s hat trick, I would definitely just be patient and wait and see what happens in late 2020, because that’s when things are going to start turning around for both companies, I think.

Fred McClimans: Yeah, you know I think from an equity perspective, those two companies, at least based on my experience in the equity research side, I would tend to evaluate them a little bit differently, one being a component supplier and the other being a complete end system product going in. But I think also here, with Apple, when this lawsuit was settled, what it did is it solidified the liability that Apple had, and once that liability is removed, everybody breathes a sigh of relief and they can move forward because of course Apple, especially with the shift into services is much more than just the iPhone product, at this point.

The converse of that, however, is that for Qualcomm, this lawsuit represented an upside, and now that we understand what that upside is, everything kind of resets. So I can understand why Apple might have gotten a bigger boost even though they're the one paying, as opposed to Qualcomm, the one receiving, in this case. Now, Dan, while you were at Dell Tech World this week, I was at the SAS Global Forum in Dallas, their big annual user conference. It was a great event, very well received, a lot of excitement amongst the end-user community, and I had a good opportunity to talk to a large number of enterprise customers and get a sense of what they were thinking about the show and about technology adoption in general.

Like them, I walked away from that show really impressed with the demonstrations and the commitment to cognitive technologies, the predictive analytics, the artificial intelligence, machine vision and machine learning, that I saw at the conference. SAS is really going all in on those technologies, and I think they’re doing it in the right way: they’re directly applying these technologies in ways that can provide easy-to-implement, demonstrable value to customers.

There were two things that struck me at the show, common threads across just about everything they announced there. The first was that, whether it’s planned or not, they have put a very strong emphasis on risk mitigation. A good example is one of the tools they brought out at the show for budgeting and forecasting. Whether it’s for sales targets, for production, or just about any aspect of business, there’s always that forecast aspect in play. What they’re doing is using machine learning and predictive analytics to offer suggestions on your forecasts as you create them. It takes about 3,000 records to build it up, and probably a good six months or so for the average enterprise to gather enough data to start applying this technology. But when you enter a forecast, it looks at all the historical data you have, as well as at the person who’s inputting the data and their historical accuracy, and makes recommendations saying, “We see you forecasting this in this particular area. Based on past performance, we’re going to suggest that you adjust that number up or down, and here’s why.”

I think that’s a great tool. Based on the conversations I had with the representatives from SAS, you could get an additional five to six percent accuracy out of your forecasts, and I thought that was great, very much focused on risk mitigation. They have a similar tool in the healthcare space for tracking health outcomes, which I thought was pretty slick. The other thing that struck me across pretty much all of their products was an emphasis on data visualization. Very often when we talk about the complexities of data and the tools people are using today, historically, for a lot of people, data meant massive Excel spreadsheets. What SAS is doing is taking that data and putting it into a very, very easy visual format that allows you to digest things really quickly.

In essence, they’re hiding a bit of the tech behind the scenes, behind the covers. But it was a very good show, and as I mentioned, they’re all in on the AI tech, which was great to see, along with the risk mitigation and data visualization tools that were pretty consistent across the board. I think they’re definitely headed down the right path there.

So, Olivier, let’s go back to you for your second Fast Five, with Juul. And you’re going to have to spell that for us.

Olivier Blanchard: Juul. J-U-U-L. Right, the E-cigarette company. It’s an interesting little thing that caught my eye, and I considered almost making this the Tech Bites today because I’m a little bit annoyed with it. So Juul announces that it’s going to create an app for users, with Juul, again, an E-cigarette company, trying to position itself as moving into healthcare, quote unquote. And the app is supposed to, at least on the surface, help Juul users, customers, essentially manage their vaping, I assume. Kind of like managing your screen time on a smartphone, but instead managing your vaping time, I suppose.

Fred McClimans: Well, Juul has been very strong in promoting their product not as a recreational tool but as a way to stop smoking cigarettes, so I see a bit of that tie in there.

Olivier Blanchard: I understand the marketing and PR angle there. I used to be in marketing too, so obviously whoever is managing PR for Juul deserves every dollar they make and probably needs a raise, because they’re doing an amazing job at what they do. However, I’m speaking as not really the most cynical guy in the world, but as someone who understands the tobacco industry and now this tobacco-adjacent industry, and I’m sorry if Juul’s PR people and lawyers might disagree with my opinion on this matter, but it is what it is.

In the context of data collection and privacy, I see this as more of a scheme, more of a surface kind of play. Yes, the app will do what Juul claims it will; it will in fact let Juul consumers, users, better manage their health and better manage and understand their vaping habits. However, the app also appears to be, at least to me, a mechanism by which Juul will be able to collect very important, very useful customer data directly from their E-cigarettes, from their phones: how often they use the product, when and where they use the product, in whose proximity they use the product. This is the kind of market data and market intelligence, gradually collected from each user, that is absolute gold to any company, whether you’re selling soda, or sneakers, or tennis rackets, whatever, it doesn’t matter. And Juul is no different.

So it’s actually a brilliant strategy. The execution is actually really solid. From a business standpoint, if I worked for them, this would be a huge positive check mark on my resume. However, for consumers, I’m a little bit concerned that this data collection effort by Juul is not particularly customer-centric and not particularly transparent either. I mean, it’s accidentally transparent, not deliberately transparent. So I just wanted to bring that up, and if you’re in marketing for one of these companies, understand that what works well internally might not work well externally. But again, this is just another example of how technology can be used to collect user data where users opt in but don’t necessarily understand what data’s being collected and why.

Fred McClimans: Got it, got it. That’s an interesting one. These days just about everybody can be viewed in some way as being a tech company, and I guess this is Juul’s foray into that space. It’s going to be interesting to see how well it works out. I just can’t wrap my head around Juul really being a healthcare kind of provider there. Dan, take us home for our Fast Five. This week there were a couple of Blockchain announcements out there: AWS announced their Blockchain service and Microsoft announced theirs, their managed Blockchain service. Tell us about it.

Daniel Newman: Well, first of all, I’ll wrap up our slow five, our extremely long slow five this week. Thank you guys for turning every single one of our Fast Fives into a full-blown main topic dive. All right, IBM has long been the leader in the Blockchain space and was almost alone when it came to enterprise Blockchain, right? Blockchain’s been a cryptocurrency topic. Well, this past week, both AWS and now Microsoft with Azure have announced a managed Blockchain service, Blockchain as a service, making Azure the third of the group, and this is right ahead of their Build conference, which I will be at this Sunday and Monday. So I’m excited to hear more about it.

Basically, they believe we are finally at the point where enterprise apps can run on this, and this has nothing to do with cryptocurrencies. They said, “We are not talking cryptocurrencies here. This is an enterprise service that is meant to help businesses build applications on top of Blockchain technologies.” It is integrated with Azure Active Directory and offers tools for adding new members, setting permissions, and monitoring network health and activity. By the way, these kinds of launches are the first step in getting us to what I talked about with digital custodianship. Now, I think the move was highly motivated by a big customer opportunity. The first supported ledger that will be run by Microsoft will be JPMorgan’s Quorum, a secure way of supporting confidential transactions.

Having that big customer and that big use case, I think, finally propelled Microsoft to launch fully into this managed service space. For those of you who didn’t know, they do have a Blockchain development kit, they do have Azure Blockchain, so this is an extension of some of the, I’d say, independent services that Microsoft has been offering on Azure for a while. But now they’re going all in, and if you want to utilize Blockchain for business, or for things like I mentioned when it comes to confidential transactions and security, they have it natively offered in Azure. With both Amazon and Azure coming into this space, it’s a sign that it’s going to get bigger. It’s going to put some new pressure on IBM, but I think it also validates that IBM’s been here for so long. Now we’ll just start to see whether IBM can keep the momentum, and how quickly Azure and AWS can grab new customers and use cases and actually drive overall growth in the Blockchain market.

Fred McClimans: Dan, I think this is a big move here, when you combine this with AWS already building on IBM services, as well as the services of others out there, Doit and a lot of the service providers that are offering managed services. What this really does is take the technical, underlying aspect of Blockchain out of the hands of the enterprise and say, “Look, if you want to implement Blockchain, if you want to try a proof of concept, if you want to play around with it and see if it works right for you,” now you can do that in a managed services environment without having to worry about a lot of the underlying infrastructure. I think these are two great boosts for the Blockchain industry.

That brings us to the lack of boost and our Tech Bites segment of the week. This week I do want to talk a little bit about the privacy aspect, and we are going to stray a little bit into politics perhaps. What happened was, as we know, there is an ongoing trade war, a conflict of sorts, between the US and China, in fact between the US and a lot of countries these days. But a lot of the negotiations with China centered around a unique aspect: the theft of intellectual property and literally spying through technology, as we’ve seen with companies like Huawei and ZTE in the past. Those aspects, the data privacy, the data theft, the IP theft, the cyber security, had been part of the overall trade negotiations from the US position.

This week, however, we are seeing multiple reports that the Trump administration is dropping that aspect of the discussion from the trade negotiations in order to push a deal with China out by the end of summer. I’ve got to say, I’m very disappointed to hear that. This is very similar to the coin-driven approach we saw with ZTE in the past, where ZTE was held up as this unethical company that was involved in potentially spying on data and all sorts of things, and at the end of the day, when we banned ZTE, it was resolved by ZTE paying a billion-dollar fine and all was forgiven. I’d like to hope that that’s not the case here, but this whole deal just seems like coin-driven policy, and that is just not anything I’m a real big fan of here today.

Olivier, I know you’ve been tracking this space, and you’re very much up to date on what’s going on politically these days. What’s your position on this, what’s your thought? Is this really, truly a Tech Bites, or is it even perhaps a Tech Double Bites?

Olivier Blanchard: Yeah, I mean, it’s a Tech Bites, it’s a policy Bites. I’m amazed that it’s even [inaudible], right? At what point in the negotiations did both parties come together and say, “Okay, we’ll give you this, but you have to let us steal your IP or get softer on enforcement”? How is that even communicated in those negotiations? What struck me about it isn’t so much that we took a softer stance, it’s that it came up at all. This is something that, on the Chinese side, as a matter of pride, they should deny they’re even doing, right? Not something they should use as a lever to gain an advantage somewhere else.

I’m definitely very disappointed that any US administration would not take a stance to defend the value of US intellectual property, especially with regard to technology, and especially moving to 5G. Knowing what we know about the Chinese rivalry with the US to try to gain control, or at least leadership, to use the politically correct term, in 5G technologies, the US backstepping a little bit and taking a weaker stance against that effort by China, especially in 5G and technology, is not only unfortunate but also, I think, ill-advised and potentially dangerous.

But on the other hand, it’s very difficult to enforce. So the US’s calculation might have been: look, we’re not for it, and we don’t want to take a softer stance, but in reality we have very little recourse to enforce our IP leadership, or even our IP integrity, in China. By taking this stance with regard to China, in practicality we’re not changing anything; we’re just giving them something they were going to get anyway, we’re just not posturing.

Fred McClimans: It’s interesting that you bring up enforcement there, because in the reports coming out of China this week, after the trade negotiation meetings between the US and China earlier this week, one of the key differences between what was being said by the two parties was that the Chinese papers were reporting that their government was trying to place a much greater level of control and monitoring on this trade deal, but from a Chinese perspective, looking at the US and saying, look, you’ve got to let us see exactly what the US is doing, because we need to be able to enforce this deal from our perspective. Enforcement is always a key issue there.

Olivier Blanchard: One more thing, and I’ll just interject this because it just occurred to me: China has actually been cracking down on IP theft within its own borders, right? It actually has an effort underway, and maybe this is just being misunderstood, or miscommunicated rather, by the US administration. By backing off from its demands, the US is actually just kind of ceding that authority and giving China more room to operate under its own new rules. It’s not actually giving anything away; it might just be saying, okay, China, you’re doing your own thing, we’re not going to press you on it, we’re going to back off and let you do your thing in good faith. That could also be what’s happening.

Fred McClimans: It may also be that the US, hearing that China was looking for a much stronger position to monitor and enforce the trade deal from its perspective, simply said, “Look, we’re not willing to give you that level of transparency, so let’s just call it a day and punt on this issue.” Dan, what are your thoughts on this?

Daniel Newman: I’m going to keep it short, but I just don’t see any cooperation taking place at this point. I just think they’re too far apart right now. In the future, we’d like to see them get closer, we’d like to see them work on driving more transparency. This kind of goes back to where we started this whole show: how do we really make progress when you have so many different bodies and regulators, and, like I said, as data continues to move, who controls it, who owns it once it’s been shared, do you ever get it back? And I realize I digress a little bit from the topic itself, but like I said, short but sweet: I just don’t see this cooperation taking place anytime soon. I think it’s a nice idea, but for them to ask the US to open up the kimono, I just think it’s a dream. It’s kind of like getting Americans to drive cars fueled by ethanol. As my favorite comedian, Daniel Tosh, said, “Ethanol: it’s a dream and a dumb one.”

Fred McClimans: Ouch, ouch. Well hey, yeah, I’m just going to let that one die on the vine. So gentlemen, before we wrap up the show, it’s time to pull out our Futurum Crystal Ball. As we often do, we’re going to go back to today’s main dive topic and the issue of privacy, specifically because it has been bandied around so much of late. We’re going to take a look at Facebook, and the question I have for you, put on your little Carnac the Magnificent hat here: will Facebook be successful in transitioning to this new privacy business model, to the point where they can actually stave off regulatory control, or even some type of management approval oversight by the US government, as they move forward? So, to cut it clean and short, is this going to work for Facebook? Can they keep the regulators out, or are they doomed to regulation for the rest of their lives? Dan?

Daniel Newman: I think they’re inviting it. I think they need it. I think they’ve actually lost control. The bobsled is off the track, and I think they know it. They realize that their fate sits in the hands of the illusion that they’re cooperating. And if you heard what I said, it’s not actual plans to cooperate, it is the illusion that they plan to cooperate, and the fact that people by and large are stupid. I realize I sound like a cynic, but as I said, we’re technical people. We’re in this business, we understand how it all works, and we still don’t really know exactly what happens to our data. What do you think the average citizen thinks? What about Grandma, who uses Facebook to share everything, or the average Gen Xer who uses it to chat or, as Olivier suggested in an offline conversation, chooses to share the nine crushes they have? That’s really healthy, by the way, Facebook, that’s great. Let’s promote infidelity as well.

But the point is, I simply cannot see them doing it on their own. I see them doing it, but I feel like it’s all for show. They’ve had 23 or 24 scandals in the last two years that have come across the desks of Sheryl Sandberg and Mark Zuckerberg. If they cared about this, it would’ve been done a long time ago. They don’t care. It doesn’t do them any good to care, and so the best chance they have of gaining back people’s trust is to make it feel like, “Oh, we’re not the ones actually making these decisions, someone else is.” So it’s a great marketing ploy. It’s the Juul play: we’re really worried about your health. Wag the dog, right? So I say no, not a chance, and that’s going to stand, and I’m right.

Fred McClimans: Okay, so Dan’s vote is increased regulatory presence that Facebook actually welcomes in and uses to create a magical illusion. Olivier, what’s your take on this? Will Facebook be successful and keep regulators out?

Olivier Blanchard: Well, let me tell you a story. No, I’m not, I’m not going to do that. But just to illustrate before I answer, and I think you’ll guess what my answer is from the story: this week, Facebook was caught allowing an allegedly fraudulent campaign ad, by the Trump campaign actually, to be posted on Facebook, and some watchdog groups flagged it and said, “Wait, this political ad that you’ve allowed on your platform is full of misstatements, and it’s exactly the sort of thing that you said you weren’t going to do.” And just to remind everybody, last year, in 2018, in response to the scandals around election influence and fraudulent content on Facebook, Facebook promised to hire over 4,000 human moderators to, in part, focus on determining which political ads are essentially fake news and which ones are legitimate.

So they promised to hire thousands of human staff to monitor this and to validate ads before they’re actually authorized on the platform. Facebook’s answer this week, when confronted with this ad, which by the way it took down, and the campaign changed the wording and posted it back up, was that the algorithm didn’t catch it, and all we have is an algorithm looking at it. So here we are in 2019, just months away from 2020, the next presidential election, and although Facebook has promised to make all of these changes to solve this problem that it itself created, there is no evidence whatsoever that any of those changes were made, or that even a single person has been assigned to review these cases.

I, myself, by the way, was “zucked.” It’s a new term; it’s that popular, it happens so often on Facebook. I was banned from Facebook for seven days because I used keywords that resembled keywords used by white supremacists. For the record, I am definitely the opposite of a white supremacist, and it’s happening to a lot of users. So Facebook is relying on very poor algorithms to sort all this stuff out, even though it promised that it was going to use humans. Because Facebook is unable and/or unwilling to actually address these problems and fulfill its own promises, I think it’s inevitable that the government and regulators will have to step in to force Facebook into different modes of operation and to create lanes and training wheels for the platform, because it’s obviously not doing it on its own, and it’s going to have to if it’s going to be allowed to continue.

Fred McClimans: Yeah, that’s actually an interesting story there. I had not run across that this week. Guys, I tend to agree with both of you on this. I don’t think that Facebook’s pivot to privacy is a pivot to privacy; I think it’s just a pivot to a different spotlight that doesn’t have as much light shining on it. At the same time, I don’t see any way that Facebook moves forward without some type of increased regulatory oversight. The one thing I am very cautious about is the idea of the government actually having approval over some type of security officer or privacy officer within Facebook. That, to me, just does not sit well. It smacks a little too much of government intrusion at that level. But I think we’re definitely going to see some guardrails placed around Facebook, YouTube, Twitter, and a bunch of others moving forward.

So with that, Dan, Olivier, thank you very much for being part of this week’s edition of FTP, the Futurum Tech Podcast. I would like to thank our listeners and remind them that if you like the podcast, go ahead and subscribe to it. Share it with your friends, hit the like button, give us a great rating on iTunes, on SoundCloud, on Spotify, because we are there and many other places as well. And with that, hope everybody has a great weekend, we’ll see you next week for another edition of FTP, the Futurum Tech Podcast.

Disclaimer: The Futurum Tech Podcast is for information and entertainment purposes only. Over the course of this podcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors and we do not ask that you treat us as such.

Author Information

Fred is an experienced analyst and advisor, bringing over 30 years of knowledge and expertise in the digital and technology markets.

Prior to joining The Futurum Group, Fred worked with Samadhi Partners, launching the Digital Trust practice at HfS Research, Current Analysis, Decisys, and the Aurelian Group. He has also worked at Gartner, E&Y, Newbridge Networks’ Advanced Technology Group (now Alcatel) and DTECH LABS (now part of Cubic Corporation).

Fred studied engineering and music at Syracuse University. A frequent author and speaker, Fred has served as a guest lecturer at the George Mason University School of Business (Porter: Information Systems and Operations Management), keynoted the Colombian Asociación Nacional de Empresarios Sourcing Summit, served as an executive committee member of the Intellifest International Conference on Reasoning (AI) Technologies, and has spoken at #SxSW on trust in the digital economy.

His analysis and commentary have appeared through venues such as Cheddar TV, Adotas, CNN, Social Media Today, Seeking Alpha, Talk Markets, and Network World (IDG).

