A conversation with Microsoft CTO Kevin Scott – Microsoft Research

Chief Technology Officer Kevin Scott

Episode 36, August 8, 2018

Kevin Scott has embraced many roles over the course of his illustrious career in technology: software developer, engineering executive, researcher, angel investor, philanthropist, and now, Chief Technology Officer of Microsoft. But perhaps no role suits him so well – or has so fundamentally shaped all the others – as his self-described role of “all-around geek.”

Today, in a wide-ranging interview, Kevin shares his insights on both the history and the future of computing, talks about how his impulse to celebrate the extraordinary people “behind the tech” led to an eponymous non-profit organization and a podcast, and… reveals the superpower he got when he was in grad school.


Episode Transcript

Kevin Scott: It’s a super exciting time. And it’s certainly something that we are investing very heavily in right now at Microsoft, in the particular sense of like, how do we take the best of our development tools, the best of our platform technology, the best of our AI, and the best of our cloud, to let people build these solutions where it’s not as hard as it is right now?

Host: You’re listening to the Microsoft Research Podcast, a show that brings you closer to the cutting-edge of technology research and the scientists behind it. I’m your host, Gretchen Huizinga.

Kevin Scott has embraced many roles over the course of his illustrious career in technology: software developer, engineering executive, researcher, angel investor, philanthropist, and now, Chief Technology Officer of Microsoft. But perhaps no role suits him so well – or has so fundamentally shaped all the others – as his self-described role of “all-around geek.”

Today, in a wide-ranging interview, Kevin shares his insights on both the history and the future of computing, talks about how his impulse to celebrate the extraordinary people “behind the tech” led to an eponymous non-profit organization and a podcast, and… reveals the superpower he got when he was in grad school. That and much more on this episode of the Microsoft Research Podcast.

Host: Kevin Scott, welcome to the podcast today.

Kevin Scott: Well thank you so much for having me.

Host: So, you sit in a big chair. I think our listeners would like to know what it’s like to be the Chief Technology Officer of Microsoft. How do you envision your role here, and what do you hope to accomplish in your time? I.e., what are the big questions you’re asking, the big problems you’re working on? What gets you up in the morning?

Kevin Scott: Well, there are tons of big problems. I guess the biggest, and the one that excites me the most and that prompted me to take the job in the first place, is that technology is playing an increasingly important role in how the future of the world unfolds. It has an enormous impact on our day-to-day lives, from the mundane to the profound. And I think having a responsible philosophy about how you build technology is a very, very important thing for the technology industry. So, in addition to solving all of these, sort of, complicated problems of the “how” – what technology do we build and how do we build it? – there’s also sort of an “if” and a “why” that we need to be addressing as well.

Host: Drill in a little there. The “if” and the “why.” Those are two questions I love. Talk to me about how you envision that.

Kevin Scott: You know, I think one of the more furious debates that we all are increasingly having, and I think the debate itself and the intensity of the debate are good things, is sort of around AI and what impact is AI going to have on our future, and what’s the right way to build it, and what are a set of wrong ways to build it? And I think this is sort of a very important dialogue for us to be having, because, in general, I think AI will have a huge impact on our collective futures. I actually am a super optimistic person by nature, and I think the impact that it’s going to have is going to be absolutely, astoundingly positive and beneficial for humanity. But there’s also this other side of the debate, where…

Host: Well, I’m going to go there later. I’m going to ask you about that. So, we’ll talk a little bit about the dark side. But also, you know, I love the framework. I hear that over and over from researchers here at Microsoft Research who are optimistic, saying, if there are issues, we want to get on the front end of them and start to drive and influence how those things can play out. So…

Kevin Scott: Yeah, absolutely. There’s a way to think about AI where it’s mostly about building a set of automation technologies that are a direct substitute for human labor, and you can use those tools and technologies to cause disruption. But AI probably is going to be more like the steam engine, in the sense that the steam engine was also a direct substitute for human labor. And the people that benefited from it initially were those who had the capital to build them, because they were incredibly expensive, and who had the expertise to design them and to operate and maintain them. Eventually, access to this technology fully democratized, and access to AI will eventually democratize in the same way. Our role, as a technology company that is building things that empower individuals and businesses, is to democratize access to the technology as quickly as possible and to do that in a safe, thoughtful, ethical way.

Host: Let’s talk about you for a second. You’ve described yourself as an engineering executive, an angel investor, and an all-around geek. Tell us how you came by each of those meta tags.

Kevin Scott: Yeah… The geek was the one that was sort of unavoidable. It felt to me, all my life, like I was a geek. I was this precociously curious child. Not in the sense of, you know, playing Liszt piano concertos when I’m 5 years old or anything. No, I was the irritating flavor of precocious, where I’m sticking metal objects into electric sockets and taking apart everything that could be taken apart in my mom’s house to try to figure out how things worked. And I’ve had just sort of weird, geeky, obsessive tastes in things my entire life. And I think a lot of everything else just sort of flows from me, at some point, fully embracing that geekiness. I mean, angel investing, for instance, is me wanting to give back. I have benefited so much over the course of my career from folks investing in me when it wasn’t a sure bet at all that that was going to be a good return on their time. I’ve had mentors and people who just sort of looked at me and, for reasons I don’t fully understand, have been super generous with their time and their wisdom. And angel investing is less about an investment strategy and more about me wanting to encourage that next generation of entrepreneurs to go out and make something, and then trying to help them, in whatever way I can, be successful and find the joy that there is in bringing completely new things into the world that are, you know, sort of non-obvious and complicated.

Host: Mmmm. Speaking of complicated. One common theme I hear from tech researchers here on this podcast, at least the ones who have been around a while, is that things aren’t as easy as they used to be. They’re much more complex. And in fact, a person you just talked to, Anders Hejlsberg, recently said, “Code is getting bigger and bigger, but our brains are not getting bigger, and this is largely a brain exercise.”

Kevin Scott: Yes.

Host: So, you’ve been around a while. Talk about the increased complexity you’ve seen and how that’s impacted the lives and work of computer scientists and researchers all around.

Kevin Scott: I think interestingly enough, on the one hand, it is far more complicated now than it was, say, 25 years ago. But there’s a flipside to that where we also have a situation where individual engineers or small teams have unprecedented amounts of power in the sense that, through open-source software and cloud computing and the sophistication of the tools that they now use and the very high level of the abstractions that they have access to that they use to build systems and products, they can just do incredible things with far fewer resources and in far shorter spans of time than has ever been possible. It’s almost this balancing act. Like, on the other hand, it’s like, oh my god, the technology ecosystem, the amount of stuff that you have to understand if you are pushing on the state-of-the-art on one particular dimension, which is what we’re calling upon researchers to do all the time, it’s really just sort of a staggering amount of stuff. I think about how much reading I had to do when I was a PhD student, which seemed like a lot at the time. And I just sort of look at the volume of research that’s being produced in each individual field right now. The reading burden for PhD students right now must be unbelievable. And it’s sort of similar, you know, like, if you’re a beginning software engineer, like it’s a lot of stuff. So, it’s this weird dichotomy. I think it’s, perhaps if anything, the right trade off. Because if you want to go make something and you’re comfortable navigating this complexity, the tools that you have are just incredibly good. I could have done the engineering work at my first startup with far, far, far fewer resources, with less money, in a shorter amount of time, if I were building it now versus 2007. But I think that that tension that you have as a researcher or an engineer, like this dissatisfaction that you have with complexity and this impulse to simplicity, it’s exactly the right thing, because if you look at any scientific field, this is just how you make progress.

Host: Listen, I was just thinking, when I was in my master’s degree, I had to take a statistics class. And the guy who taught it was ancient. And he was mad that we didn’t have to do the math because computer programs could already do it. And he’s not wrong. It’s like, what if your computer breaks? Can you do this?

Kevin Scott: That is fascinating, because old fart computer scientist engineers like me have this… like, we bemoan a similar sort of thing all the time, which is, ahhh, these kids these days, they don’t know what it was like to load their computer program into a machine from a punch paper tape.

Host: Right?

Kevin Scott: And they don’t know what ferrite core memories are, and what misery we had to endure to… It was fascinating and fun to learn all of that stuff, and I think you did get something out of it. It gave you a certain resilience and sort of fearlessness against these abstraction boundaries. You know, if something breaks, you feel like you can go all the way down to the very lowest level and solve the problem. But it’s not like you want to do that stuff. All of that’s a pain in the ass. You can do so much more now than you could then because, to use your statistics professor’s phrase, you don’t have to do all of the math.

(music plays)

Host: Your career in technology spans the spectrum including both academic research and engineering and leadership in industry. So, talk about the value of having experience in both spheres as it relates to your role now.

Kevin Scott: You know, the interesting thing about the research that I did is, I don’t know that it ever had a huge impact. The biggest thing that I ever did was this work on dynamic binary translation, and the thing I’m proudest of is that I wrote a bunch of software that people still use, to this day, to do research in this very arcane, dark alley of computer science. But what I do use all the time, and that is almost like a superpower that I think you get from being a researcher, is being able to very quickly read and synthesize a bunch of super-complicated technical information. I believe it’s less about IQ and more about the skill that you learn when you’re a graduate student trying to get yourself ramped up to mastery in a particular area. It’s just: read, read, read, read, read. You know, I grew up in a relatively economically depressed part of rural, central Virginia, a town of 250 people. Neither of my parents went to college. We were poor when I grew up, and no one around me was into computers. And somehow or another, I got into this science and technology high school when I was a senior. And I decided that I really, really, really wanted to be a computer science professor after that year. And so, I went into my undergraduate program with this goal in mind. I would sit down with things like the Journal of the ACM at the library, convinced that, obviously, computer science professors need to be able to read and understand this. And I would stare at papers in JACM, and I’m like, oh my god, I’m never, ever going to be good enough. This is impossible. But I just kept at it. And it got easier by the time I was finishing my undergraduate degree. And by the time I was in my PhD program, I was very comfortably blasting through stacks of papers on a weekly basis. And then, towards the end of my PhD program, you’re on the program committees for these things, and not only are you blasting through stacks of papers, but you’re able to understand them well enough that you can provide useful feedback for the people who have submitted them for publication. That is an awesome, awesome, super-valuable skill to have when you’re an engineering manager, or if you’re a CTO, or you’re anybody who’s trying to think about where the future of technology is going. So, to every person who is working on their PhD or their master’s degree right now and for whom this is part of their training: don’t bemoan that you’re having to do it. You’re doing the computer science equivalent of learning how to play that Liszt piano concerto. You’re getting your 10,000 hours in, and it’s going to be a great thing to have in your arsenal.

Host: These days, especially in a digitally-distracted age, being able to pay attention to dense academic papers – or, you know, anything – for a long period of time is a superpower!

Kevin Scott: It is. It really is. You aren’t going to accomplish anything great by you know integrating information in these little 2-minute chunks. I think pushing against the state-of-the-art, like you know creating something new, making something really valuable, requires an intense amount of concentration over long periods of time.

Host: So, you came to Microsoft after working at a few other companies, AdMob, Google, LinkedIn. Given your line of sight into the work that both Microsoft and other tech giants are doing, what kind of perspective do you have on Microsoft’s direction, both on the product and research side, and specifically in terms of strategy and the big bets that this company is making?

Kevin Scott: I think the big tech companies, in particular, are in this really interesting position, because you have both the opportunity and the responsibility to really push the frontier forward. The opportunity, in the sense that you already have a huge amount of scale to build on top of, and the responsibility of knowing that some of the new technologies are just going to require large amounts of resources and, sort of, patience. You know, one example that we’re working on here at Microsoft: we, the industry, have been worried about the end of Moore’s Law for a very long time now. And it looks like, for sort of general purpose flavors of compute, we are pretty close to the wall right now. And so, there are two things that we’re doing at Microsoft right now that are trying to mitigate part of that. One is quantum computing, which is a completely new way to try to build a computer and to write software. And we’ve made a ton of progress over the past several years. And our particular approach to building a quantum computer is really exciting, and it’s like this beautiful collaboration between mathematicians and physicists and quantum information theory folks and systems and programming language folks trained in computer science. But when, exactly, is this going to be a commercially viable technology? I don’t know. But another thing that we’re pushing on, related to this Moore’s Law wall, is doing machine learning where you’ve got large data sets that you’re fitting models to, where the underlying optimization algorithms that you’re using, for DNNs or all the way back to more prosaic things like logistic regression, boil down to a bunch of linear algebra. We are increasingly finding ways to solve these optimization problems in these embarrassingly parallel ways where you can use special flavors of compute. And so there’s just a bunch of super interesting work that everybody’s doing with this stuff right now, from Doug Burger’s Project Brainwave stuff here at Microsoft to… uh, so it’s a super exciting time, I think, to be a computer architect again, where the magnitude and the potential payoffs of some of these problems are just astronomically high. It takes me back to the 80s and 90s, which were maybe the halcyon days of high-performance computing and those big monolithic supercomputers that we were building at the time. It feels a lot like that right now, where there’s just this palpable excitement about the progress that we’re making. Funny enough, I was having breakfast this morning with a friend of mine, and both of us were saying, man, this is just a fantastic time in computing. You know, on an almost weekly basis, I encounter something where I’m like, man, this would be so fun to go do a PhD on.

Host: Yeah. And that’s a funny sentence right there.

Kevin Scott: Yeah, it’s a funny sentence. Yeah.

(music plays)

Host: Aside from your day job, you’re doing some interesting work in the non-profit space, particularly with an organization called Behind the Tech. Tell our listeners about that. What do you want to accomplish? What inspired you to go that direction?

Kevin Scott: Yeah, a couple of years ago, I was just looking around at all of the people that I work with who were doing truly amazing things, and I started thinking about how important role models are, both for kids who are trying to imagine a future for themselves and for professionals who are already in the discipline and trying to imagine what their next step ought to be. And it’s always nice to be able to put yourself in the shoes of someone you admire and say, “Oh, I can imagine doing this. I can see myself in this career.” And I thought we just do a poorer job than we should at showing the faces and telling the stories of the people who have made these major contributions to the technology that powers our lives. So that was sort of the impetus for behindthetech.org. I’m an amateur photographer, so I started doing these portrait sessions with the people I know in computing who I knew had done impressive things. And then I hired someone to help interview them and write a slice of their story, so that if you want to go somewhere and get inspired about the people who are making tech, behindthetech.org is the place for you.

Host: So, you also have a brand-new podcast, yourself, called Behind the Tech. And you say that you look at the tech heroes who’ve made our modern world possible. I’ve only heard one, and I was super impressed. It’s really good. I encourage our listeners to go find Behind the Tech podcast. Tell us why a podcast on these tech heroes that are unsung, perhaps.

Kevin Scott: I have this impulse in general to try to celebrate the engineer. I’m just so fascinated with the work that people are doing or have done. Like, the first episode is with Anders Hejlsberg, who is a technical fellow at Microsoft, and who’s been building programming languages and development tools for his entire 35-year career. Early in his career, he wrote this programming language and compiler called Turbo Pascal. I wrote my first real programs using the tools that Anders built. And he’s gone on from Turbo Pascal to building Delphi, which was one of the first really nice integrated development environments for graphical user interfaces; then, at Microsoft, he was the chief architect of the C# programming language. And now he’s building this programming language based on JavaScript called TypeScript that tries to solve some of the development-at-scale problems that JavaScript has. And that, to me, is just fascinating. How did he start on this journey? How has he been able to build these tools that so many people love? What drives him? I’m just intensely curious about that. And I just want to help share those stories with the rest of the world.

Host: Do you have other guests that you’ve already recorded with or other guests lined up?

Kevin Scott: Yeah, we’ve got Alice Steinglass, who is the president of Code.org, who is doing really brilliant things to help K-12 students learn computer science. And we’re going to talk with Andrew Ng in a few weeks, who is one of the titans of deep neural networks, machine learning and AI. We’re going to talk with Judy Estrin, who is a former CTO of Cisco, a serial entrepreneur, and a longtime board director at Disney and FedEx. Just, you know, one of the OGs of Silicon Valley. So it’s going to be a really good mix of folks.

Host: Yeah, well, it’s impressive.

Kevin Scott: All with fascinating stories.

Host: Yeah, and just having listened to the first one, I was – I mean, it was pretty geeky. I will be honest. There’s a lot of – it was like listening to the mechanics talking about car engines, and I know nothing, but it was…

Kevin Scott: Yeah, right?

Host: But it was fun.

Kevin Scott: That’s great. And like you know I hadn’t even thought about it before. But if it could be the sort of computer science and engineering version of Car Talk, that would be awesome.

Host: You won first place at the William Campbell High School Talent Show in 1982 by appearing as a hologram downloaded from the future. Okay, maybe not for real. But an animated version of you did explain the idea of the Intelligent Edge to a group of animated high school hecklers. Assuming you won’t get heckled by our podcast audience, tell us how you feel like AI and machine learning research are informing and enabling the development of edge computing.

Kevin Scott: You know, I think this is one of the more interesting emergent trends right now in computing. So, there are basically three things that are coming together at the same time. One thing is the growth of IoT, and just embedded computing in general. You can look at any number of estimates of where we’re likely to be, but we’re going to go from about 11 or 12 billion devices connected to the internet to about 20 billion over the next year and a half. But you think about these connected devices – and this is sort of the second trend – they all are becoming much, much more capable. So, they’re coming online, and the silicon and compute power available in all of these devices is just growing at a very fast clip. And going back to this whole Moore’s Law thing that we were talking about, if you look at $2 and $3 microprocessors and microcontrollers, most of those things right now are built on process technologies that are two or three generations old. So, they are going to increase in power significantly over the coming years, particularly the flavor of power that you need to run AI models, which is sort of the third trend. So, you’ve got a huge number of devices being connected with more and more compute power, and the compute power is going to enable more and more intelligent software to be written using the sensor data that these devices are processing. And those three things together we’re calling the intelligent edge. And we’re entering this world where you’ll step into a room and there are going to be dozens and dozens of computing devices in the room, and you’ll interface with them by voice and gesture and a bunch of other sort of intangible factors where you won’t even be aware of them anymore. And so that implies a huge set of changes in the way that we write software. Like, how do you build a user experience for these things? How do you deal with information security and data privacy in these environments? Just even programming these things is going to be fundamentally different. It’s a super exciting time. And it’s certainly something that we are investing very heavily in right now at Microsoft, in the particular sense of like, how do we take the best of our development tools, the best of our platform technology, the best of our AI, and the best of our cloud, to let people build these solutions where it’s not as hard as it is right now?

Host: Well, you know, everything you’ve said leads me into the question that I wanted to circle back on from the beginning of the interview, which is that the current focus on AI, machine learning, cloud computing, all of the things that are just like the hot core of Microsoft Research’s center – they have amazing potential to both benefit our society and also change the way we interact with things. Is there anything about what you’re seeing and what you’ve been describing that keeps you up at night? I mean, without putting too dark a cloud on it, what are your thoughts on that?

Kevin Scott: The number one thing is, I’m worried that we are actually underappreciating the positive benefit that some of these technologies can have, and are not investing as much as we could be, holistically, to make sure that they get into the hands of consumers in a way that benefits society more quickly. Just to give you an example of what I mean, we have healthcare costs right now that are growing faster than our gross domestic product. And I think the only way, in the limit, that you bend the shape of that healthcare cost growth curve is through the intervention of some sort of technology. And, week after week over the past 18 months, I’ve seen one AI-based technology after another where you combine medical data or personal sensor data with this new regime of deep neural networks, and you’re able to solve these medical diagnostic problems at unbelievably low cost, detecting fairly serious conditions very early, when they are cheaper and easier to treat and when the benefit to the patient is that they’re healthier in the limit. And so, I see technology after technology in this vein that is really going to bring higher-quality medical care to everyone for cheaper and help us get ahead of these significant diseases that folks have. And there’s a similar trend in precision agriculture, in terms of crop yields and minimizing environmental impacts, particularly in the developing world, where you still have large portions of the world’s population trapped in this agricultural subsistence dynamic. AI could fundamentally change the way that we’re all living our lives, all the way from all of us getting cheaper, better, locally-grown organic produce with smaller environmental impact, to: how does a subsistence farmer in India dramatically increase their crop yield so that they can elevate the economic status of their entire family and community?

Host: So, as we wrap up, Kevin, what advice would you give to emerging researchers or budding technologists in our audience, as many of them are contemplating what they’re going to do next?

Kevin Scott: Well, I think congratulations are in order for most folks, because this is just about as good a time, I think, as there has ever been for someone to pursue a career in computer science research, or to become an engineer. The advice that I would give is to look for ways to maximize the impact of what you’re doing, and with research, it’s sort of the same advice that I would give to folks starting a company, or to engineers thinking about the next thing they should go off and build in the context of a company: find a trend that is really a fast growth driver, like the amount of available AI training compute, or the amount of data being produced by the world in general, or by some particular subcomponent of our digital world. Just pick a growth driver like that and try to attempt something that is either buoyed by that growth driver or that is directly in the growth loop. Because I think those are the opportunities that tend to have the most headroom: even if there are lots of people working on a particular problem, it’s great if the space that you’re working in, the problem itself, has a gigantic potential upside. Those things will usually accommodate lots and lots of simultaneous activity and not devolve into a winner-takes-all or a winner-takes-most dynamic. And those also tend to be the interesting problems. You know, it’s sort of thrilling to be on a rocket ship in general.

Host: Kevin Scott. Thanks for taking time out of your super busy life to chat with us.

Kevin Scott: You are very welcome. Thank you so much for having me on. It was a pleasure.

Host: To learn more about Kevin Scott, and Microsoft’s vision for the future of computing, visit microsoft.com/research.

Will PowerShell Core 6 fill in missing features?

Administrators who have embraced PowerShell to automate tasks and manage systems will need to prepare themselves, as Microsoft plans to focus its energies on the open source version, PowerShell Core.

All signs from Microsoft indicate it is heading away from the Windows-only version of PowerShell, which the company said it will continue to support with critical fixes — but no further upgrades. The company plans to release PowerShell Core 6 shortly. Here’s what admins need to know about the transition.

What’s different with PowerShell Core?

PowerShell Core 6 is an open source configuration management and automation tool from Microsoft. Microsoft made a release candidate available in November, the most recent milestone as of this article’s publication. PowerShell Core 6 represents a significant change for administrators because it shifts from a Windows-only platform to accommodate heterogeneous IT shops and hybrid cloud networks. Microsoft’s intention is to give administrative teams a single tool to manage Linux, macOS and Windows systems.

What features are not in PowerShell Core?

PowerShell Core runs on .NET Core and uses .NET Standard 2.0; the latter is a common library that helps make some current Windows PowerShell modules work in PowerShell Core.

Because .NET Core is a subset of the .NET Framework, PowerShell Core misses out on some useful features found in Windows PowerShell. For example, the workflow feature enables admins to execute tasks or retrieve data through a sequence of automated, long-running steps. This feature is not in PowerShell Core 6; workflow capabilities such as sequencing, checkpointing, resumability and persistence are likewise unavailable, as the sketch below illustrates.
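
To make that concrete, here is a minimal sketch of a Windows PowerShell workflow of the kind that has no equivalent in PowerShell Core 6. The workflow, service and computer names are hypothetical; the point is the workflow-only syntax.

# Runs in Windows PowerShell 5.x; PowerShell Core 6 rejects the workflow keyword
workflow Test-ServiceStatus {
    param([string[]]$ComputerName)

    # Workflow-only construct: run the iterations in parallel
    foreach -parallel ($computer in $ComputerName) {
        Get-Service -Name "Spooler" -PSComputerName $computer
    }

    # Persist workflow state so an interrupted job can resume
    Checkpoint-Workflow
}

Test-ServiceStatus -ComputerName "server01", "server02"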

A few other features missing from PowerShell Core 6 are:

  • Windows Presentation Foundation: This is the group of .NET libraries that enables developers to build UIs for scripts. It offers a common platform for developers and designers to work together with standard tools to create Windows and web interfaces.
  • Windows Forms: In PowerShell 5.0 for Windows, the Windows Forms feature provides a robust platform to build rich client apps with the GUI class library on the .NET Framework. To create a form, the admin loads the System.Windows.Forms assembly, creates a new object of type System.Windows.Forms.Form and calls the ShowDialog method; a short sketch appears after this list. With PowerShell Core 6, administrators lose this capability.
  • Cmdlets: As of publication, most cmdlets in Windows PowerShell have not been ported to PowerShell Core 6. However, compatibility with .NET assemblies enables admins to use some existing modules. Users on Linux are limited to modules mostly related to security, management and utility. Admins on that platform can use the PowerShellGet in-box module to install, update and discover PowerShell modules; see the example after this list. PowerShell Web Access is not available for non-Windows systems because it requires Internet Information Services, the Windows-based web server functionality.
  • PowerShell remoting: Microsoft has ported Secure Shell (SSH) to Windows, and SSH is already popular in other environments, which makes SSH-based remoting the likeliest option for PowerShell remoting tasks; a sketch appears after this list. Modules such as Hyper-V, Storage, NetTCPIP and DnsClient have not been ported to PowerShell Core 6, but Microsoft plans to add them.
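
To illustrate the Windows Forms pattern described above, here is a minimal sketch that runs in Windows PowerShell 5.x but not in PowerShell Core 6; the dialog title is purely illustrative.

Add-Type -AssemblyName System.Windows.Forms   # fails on PowerShell Core 6, where the assembly is unavailable
$form = New-Object System.Windows.Forms.Form  # instantiate the form object
$form.Text = "Sample dialog"                  # illustrative window title
[void]$form.ShowDialog()                      # display it as a modal dialog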
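
The PowerShellGet routine mentioned in the Cmdlets item looks roughly like the following on any supported platform; PSScriptAnalyzer is just a placeholder for any PowerShell Gallery module that supports PowerShell Core.

Find-Module -Name "PSScriptAnalyzer"                         # discover a module in the PowerShell Gallery
Install-Module -Name "PSScriptAnalyzer" -Scope CurrentUser   # install it without admin rights
Update-Module -Name "PSScriptAnalyzer"                       # later, pull the newest published version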
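
And here is a sketch of SSH-based remoting as it works in PowerShell Core 6, assuming OpenSSH is configured on the target machine; the host and user names are placeholders.

$session = New-PSSession -HostName "server01" -UserName "admin" -SSHTransport   # open a session over SSH
Invoke-Command -Session $session -ScriptBlock { Get-Process }                   # run a command remotely
Remove-PSSession $session                                                       # tear down the session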

Is there a new scripting environment?

For Windows administrators, the PowerShell Integrated Scripting Environment (ISE) is a handy editor that admins use to write, test and debug commands to manage networks. But PowerShell ISE is not included in PowerShell Core 6, so administrators must move to a different integrated development environment.

Microsoft recommends admins use Visual Studio Code (VS Code). VS Code is a cross-platform tool and uses web technologies to provide a rich editing experience across many languages. However, VS Code lacks some of PowerShell ISE’s features, such as PSEdit and remote tabs. PSEdit enables admins to edit files on remote systems without leaving the development environment. Despite VS Code’s limitations, Windows admins should plan to migrate from PowerShell ISE and familiarize themselves with VS Code.
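
For admins making that move, installing PowerShell support is a one-line step, assuming the code command-line launcher is on the PATH; ms-vscode.PowerShell is the identifier of Microsoft’s PowerShell extension for VS Code.

code --install-extension ms-vscode.PowerShell   # adds PowerShell editing, IntelliSense and debugging to VS Code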

What about Desired State Configuration?

Microsoft offers two versions of Desired State Configuration: Windows PowerShell DSC and DSC for Linux. DSC helps administrators maintain control over software deployments and servers to avoid configuration drift.

Microsoft plans to combine these two options into a single cross-platform version called DSC Core, which will require PowerShell Core and .NET Core. DSC Core is not dependent on Windows Management Framework (WMF) and Windows Management Instrumentation (WMI) and is compatible with Windows PowerShell DSC. It supports resources written in Python, C and C++.
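
For readers new to DSC, a minimal Windows PowerShell DSC configuration looks roughly like this; the file path and contents are illustrative, and DSC Core is expected to follow the same declarative pattern.

Configuration EnsureTempFile {
    Import-DscResource -ModuleName PSDesiredStateConfiguration

    Node "localhost" {
        # Declare the desired state; the DSC engine enforces it and corrects drift
        File TempReadme {
            DestinationPath = "C:\Temp\readme.txt"
            Contents        = "Managed by DSC"
            Ensure          = "Present"
        }
    }
}

EnsureTempFile -OutputPath "C:\dsc"                    # compile the configuration to a MOF document
Start-DscConfiguration -Path "C:\dsc" -Wait -Verbose   # apply it and report progress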

Debugging in DSC has always been troublesome, and ISE eased that process. But with Microsoft phasing out ISE, what should admins do now? A Microsoft blog says the company uses VS Code internally for DSC resource development and plans to release instructional videos that explain how to use the PowerShell extension for DSC resource development.

PowerShell Core 6 is still in its infancy, but Microsoft’s moves show the company will forge ahead with its plan to replace Windows PowerShell. This change brings a significant overhaul to the PowerShell landscape, and IT admins who depend on this automation tool should pay close attention to news related to its development.

Industrial IoT adoption rates high, but deployment maturity low

Industrial organizations have embraced IoT; that much is clear. But a new study found current deployments aren’t very advanced yet — though that will come in time, the organizations said.

Bsquare’s 2017 Annual IIoT Maturity Study surveyed more than 300 senior-level employees with operational responsibilities from manufacturing, transportation, and oil and gas companies with annual revenues of more than $250 million.

Eighty-six percent of respondents said they have IIoT technologies in place, with an additional 12% planning to deploy IIoT within the next year. And of the 86% of industrial organizations with IIoT in place, 91% said the deployments were important to business operations.

However, while IoT is catching on, most industrial organizations are still in the early stages.

The state of IoT adoption in industrial organizations

The study outlined five levels of IoT adoption: device connectivity and data forwarding, real-time monitoring, data analytics, automation and on-board intelligence.

Seventy-eight percent of survey respondents, with transportation leading the pack, self-identified their companies at the first stage: transmitting sensor data to the cloud for analytics. Fifty-six percent, again with transportation in the lead, reached the second stage: monitoring sensor data in real time for visualization.


Dave McCarthy, senior director of products at Bsquare, said he had predicted the gap between the first two stages would be smaller, but that was no great surprise. What really surprised him was the small gap between the second and third stages: Forty-eight percent of respondents said they were using data analytics for insight, predictions and optimization, with applied analytics such as machine learning or artificial intelligence.

“What it indicates to me,” McCarthy said, “is that people who have gone down the visualization route have figured out, to some degree, some use of the data they’re collecting, and they know that analytics is going to play a part in helping them understand more closely what that data is going to mean for them.”

McCarthy wasn’t surprised to see the drop in the fourth and fifth stages: Twenty-eight percent said they were automating actions across internal systems with their IoT deployments, and only 7% had reached the edge analytics level.

“Just as expected, there’s a large drop-off from people doing analytics to people who are automating the results,” McCarthy said. “And in my mind, the highest amount of ROI comes when you can get to those levels.”

IIoT Maturity Model: maturity of IoT adoption in industrial organizations

IIoT adoption and satisfaction

Not reaching the highest levels of ROI isn’t deterring IoT adoption, though: Seventy-three percent of respondents said they expect to increase IIoT deployments over the next year, with higher IoT adoption rates in transportation and manufacturing (85% and 78%, respectively) than oil and gas (56%). Additionally, 35% of all industrial organizations believe they will reach the automation stage, and 29% are aiming to reach the real-time monitoring stage in the same time period.

Nor do the organizations using IIoT technologies always calculate ROI the way analysts and vendors do, McCarthy noted. Respondents cited machine health-related (90%) and logistics-related (67%) goals as top IoT adoption drivers, while lowering operating costs came in at 24%.

“The number one motivation that all operations-oriented companies have is improving and increasing uptime of their equipment,” McCarthy said. “I hear this over and over again. They know they eventually have to do maintenance on equipment and take things down for repairs, but it is so much more manageable when they can get ahead of that and plan for it.”

“The reality for these types of businesses is that if there are plant shutdowns or line shutdowns that last for extended periods of time, they often don’t have the ability to make up that loss in production,” McCarthy added. “You can’t just run another shift on a Saturday to pick up the slack. Oftentimes the value of the product they’re producing far outweighs the cost of operating the equipment. What this indicates to me is, ‘I’ll spend more if that means I can keep that line running because of the production value.'”

With or without traditional ROI, the majority of survey respondents said they were happy with the results they’re seeing: Eighty-four percent said their products and services were extremely or very effective, with the transportation sector seeing a 96% satisfaction rate.

Additionally, 99% of oil and gas, 98% of transportation and 90% of manufacturing organizations said IIoT would have a significant impact on their industry at a global level. Perhaps those predictions of IIoT investments reaching $60 trillion in the next 15 years and the number of internet-connected IIoT devices exceeding 50 billion by 2020 will become a reality.

Cisco cloud VP calls out trends in multicloud strategy

Large enterprises have quickly embraced multicloud strategy as a common practice — a shift that introduces opportunities, as well as challenges.

Cisco has witnessed this firsthand as the company seeks a niche in a shifting IT landscape. Earlier this year, Cisco shuttered Intercloud Services, its failed attempt to create a public cloud competitor to Amazon Web Services (AWS). Now, Cisco is betting on a multicloud strategy that draws on its networking and security pedigree, selling itself as a facilitator for corporate IT’s navigation across disparate cloud environments.

In an interview with SearchCloudComputing, Kip Compton, vice president of Cisco’s cloud platforms and solutions group, discussed the latest trends with multicloud strategy and where Cisco plans to fit in the market.

How are your customers shifting their view on multicloud strategy?

Kip Compton: It started with the idea that each application is going to be on separate clouds. It’s still limited to more advanced customers, but we’re seeing use cases where they’re spanning clouds with different parts of an application or subsystems, either for historical reasons or across multiple applications, or taking advantage of the specific capabilities in a cloud.

Hybrid cloud was initially billed as a way for applications to span private and public environments, but that was always more hype than reality. What are enterprises doing now to couple their various environments?

Compton: The way we approach hybrid cloud is as a use case where you have an on-prem data center and a public data center and the two work together. Multicloud, the definition we’ve used, is at least two clouds, one of which is a public cloud. In that way, hybrid is essentially a subset of multicloud for us.

Azure Stack is a bit of an outlier, but hybrid has changed a bit for most of our customers in terms of it not being tightly coupled. Now it is deployments where they have certain code that runs in both places, and the two things work together to deliver an application. They’re calling that hybrid, whereas in the early days, it was more about seamless environments and moving workloads between on prem and the public cloud based on load and time of day, and that seems to have faded.

What are the biggest impediments to a successful multicloud strategy?

Compton: Part of it is what types of problems people talk to Cisco about, as opposed to other companies, so I acknowledge there may be some bias there. But there are four areas that come up pretty reliably for us in customer conversations.

First is networking, not surprisingly, and they talk about how to connect from on prem to the cloud. How do they connect between clouds? How do they figure out how that integrates to their on-prem connectivity frameworks?

Then, there’s security. We see a lot of companies carry forward their security posture as they move workloads; so virtual versions of our firewalls and things like that, and wanting to align with how security works in the rest of their enterprise.

The third is analytics, particularly application performance analytics. If you move an app to a completely different environment, it’s not just about getting the functionality, it’s about being performant. And then, obviously, how do you monitor and manage it [on] an ongoing basis?

The trend we see is [customers] want to take advantage of the unique capabilities of each cloud, but they need some common framework, some capability that actually spans across these cloud providers, which includes their on-prem deployment.

Where do you draw the line on that commonality between environments?

Compton: In terms of abstraction, there was a time where a popular approach was — I’ll call it the Cloud Foundry or bring-your-own-PaaS [platform as a service] approach — to say, ‘OK, the way I’m going to have portability is I’m not going to write my application to use any of the cloud providers’ APIs. I’m not going to take advantage of anything special from AWS or Azure or anyone.’

That’s less popular because the cloud providers have been fairly successful at launching new features developers want to use. We think of it more like a microservices style or highly modular pattern, where, for my application to run, there’s a whole bunch of things I need: messaging queues, server load, database, networking, security plans. It’s less to abstract Amazon’s networking, and it’s more to provide a common networking capability that will run on Amazon.

You mentioned customers with workloads spanning multiple clouds. How are those being built?

Compton: What I referred to are customers that have an application, maybe with a number of different subsystems. They might have an on-prem database that’s a business-critical system. They might do machine learning in Google Cloud Platform with TensorFlow, and they might look to deliver an experience to their customers through Alexa, which means they need to run some portion of the application in Amazon. They’re not taking their database and sharding it across multiple clouds, but those three subsystems have to work together to deliver that experience that the customer perceives as a single application.

What newer public cloud services do you see getting traction with your customers?

Compton: A few months ago, people were reticent to use [cloud-native] services because portability was the most important thing — but now, ROI and speed matter, so they use those services across the board.


We see an explosion of interest in serverless. It seems to mirror the container phenomenon where everybody agrees containers will become central to cloud computing architectures. We’re reaching the same point on serverless, or function as a service, where people see that as a better way to create code for more efficient [use of] resources.

The other trend we see: a lot of times people use, for example, Salesforce’s PaaS because their data is there, so the consumption of services is driven by practical considerations. Or they’re in a given cloud using services because of how they interface with one of their business partners. So, as much as there are some cool new services, there are some fairly practical points that drive people’s selection, too.

Have you seen companies shift their in-house architectures to accommodate what they’re doing in the public cloud?

Compton: I see companies starting new applications in the cloud and not on prem. And what’s interesting is a lot of our customers continue to see on-prem growth. They have said, ‘We’re going to go cloud-first on our new applications,’ but the applications they already have on prem continue to grow in resource needs.

We also see interest in applying the cloud techniques to the on-prem data center or private cloud. They’re moving away from some of the traditional technologies to make their data center work more like a cloud, partially so it’s easier to work between the two environments, but also because the cloud approach is more efficient and agile than some of the traditional approaches.

And there are companies that want to get out of running data centers. They don’t want to deal with the real estate, the power, the cooling, and they want to move everything they can into Amazon.

What lessons did Cisco learn from the now-shuttered Intercloud?

Compton: The idea was to build a global federated IaaS [infrastructure as a service] that, in theory, would compete with AWS. At that time, most in the industry thought that OpenStack would take over the world. It was viewed as a big threat to AWS.

Today, it’s hard to relate to that point of view — obviously, that didn’t happen. In many ways, cloud is about driving this brutal consistency, and by having global fabrics that are identical and consistent around the world, you can roll out new features and capabilities and scale better than if you have a federated model.

Where we are now in terms of multicloud and strategy going forward — to keep customers and partners and large web scale cloud providers wanting to either buy from us or partner with us — it’s solving some of these complex networking and security problems. Cisco has value in our ability to solve these problems [and] link to the enterprise infrastructures that are in place around the world … that’s the pivot we’ve gone through.

Trevor Jones is a senior news writer with SearchCloudComputing and SearchAWS. Contact him at tjones@techtarget.com.