Tag Archives: easy

What are the steps for an Exchange certificate renewal?

An expired Exchange certificate can bring your messaging platform to a halt, but it’s easy enough to check and replace the expired certificate.

When mail stops flowing, Outlook access breaks and the Exchange Management Console/Shell throws errors, it might be time to see whether an Exchange certificate renewal is in order.

Exchange binds a default certificate to its protocols during installation, including Simple Mail Transfer Protocol (SMTP) and Internet Information Services (IIS). Many companies do not allow access to Outlook on the web, so mail is only accessible internally. This limits Exchange Server's capabilities, as Microsoft designed it to be accessible from anywhere on any device.

For companies that choose to limit Exchange's functionality, the IT staff often opts to use the default certificate, which has a five-year life span. Five years later, IT might have forgotten about the Exchange certificate renewal until countdown emails arrive warning that the certificate will expire. If nobody sees these emails and the certificate expires, problems start, as Exchange services that require a valid certificate might not work.

To check a certificate’s status, run the following PowerShell command:

Get-ExchangeCertificate | fl
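
To spot an expiring certificate more quickly, it can help to narrow that output to the expiry-related properties; a minimal sketch, using property names taken straight from the objects the cmdlet returns:

# List each certificate's thumbprint, assigned services and expiry date (NotAfter)
Get-ExchangeCertificate | Format-List Thumbprint, Subject, Services, NotAfter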

Assign a new certificate for Exchange 2010

If Exchange breaks due to an expired certificate, you might want to push for a quick fix by submitting a certificate request to an internal certificate authority. This won't work, because the certificate authority will not sign the certificate.

This is typically when trouble happens: you start to panic as help desk tickets flood in, and you might try to adjust the settings in IIS, which can break Exchange further. However, the fix is simple.

Run the New-ExchangeCertificate command to initiate the Exchange certificate renewal process. This PowerShell cmdlet creates a new self-signed certificate for Exchange 2010. The command prompts you to replace the existing certificate; answer Yes to proceed.

Exchange certificate replacement
Execute the PowerShell New-ExchangeCertificate cmdlet to build a new self-signed certificate for Exchange 2010.
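
For reference, the cmdlet call itself is minimal; a sketch of the step the screenshot describes, keeping the result so its thumbprint can be reused in the next step:

# Create the replacement self-signed certificate; Exchange prompts before overwriting the existing one
$newCert = New-ExchangeCertificate
# Note the thumbprint for when services are re-assigned to the new certificate
$newCert.Thumbprint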

Next, assign the services from the old certificate to the new one and perform an IISReset from an elevated command prompt to get Exchange services running again.
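
A minimal sketch of that step, assuming the new certificate's thumbprint has been noted from Get-ExchangeCertificate; the thumbprint value and the service list below are placeholders:

# Hypothetical thumbprint; read the real value from Get-ExchangeCertificate
$thumbprint = "A1B2C3D4E5F6"
# Re-assign the services that were bound to the old certificate, for example IIS and SMTP
Enable-ExchangeCertificate -Thumbprint $thumbprint -Services IIS,SMTP
# Then, from an elevated command prompt, restart IIS so the new certificate takes effect
iisreset /noforce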

Finally, ensure the bindings in IIS are set to use the new certificate.

Panasas storage roadmap includes route to software-defined

Panasas is easy to overlook in the scale-out NAS market. The company’s products don’t carry the name recognition of Dell EMC Isilon, NetApp NAS filers and IBM Spectrum Scale. But CEO Faye Pairman said her team is content to fly below the radar — for now — concentrating mostly on high-performance computing, or HPC.

The Panasas storage flagship is the ActiveStor hybrid array with the PanFS parallel file system. The modular architecture scales performance in a linear fashion, as additional capacity is added to the system. “The bigger our solution gets, the faster we go,” Pairman said.

Panasas founder Garth Gibson launched the object-based storage architecture in 2000. Gibson, a computer science professor at Carnegie Mellon University in Pittsburgh, was a developer of the RAID storage taxonomy. He serves as Panasas’ chief scientist.

Panasas has gone through many changes over the past several years, marked by varying degrees of success to broaden into mainstream commercial NAS. That was Pairman’s charter when she took over as CEO in 2010. Key executives left in a 2016 management shuffle, and while investors have provided $155 million to Panasas since its inception, the last reported funding was a $52.5 million venture round in 2013.

As a private company, Panasas does not disclose its revenue, but “we don’t have the freedom to hemorrhage cash,” Pairman said.

We caught up with Pairman recently to discuss Panasas’ growth strategy, which could include offering a software-only license option for PanFS. She also addressed how the vendor is moving to make its software portable and why Panasas isn’t jumping on the object-storage bandwagon.

Panasas storage initially aimed for the high end of the HPC market. You were hired to increase Panasas’ presence in the commercial enterprise space. How have you been executing on that strategy?

Faye Pairman: It required looking at our parallel file system and making it more commercially ready, with features added to improve stability and make it more usable and reliable. We’ve been on that track until very recently.

We have an awesome file system that is very targeted at the midrange commercial HPC market. We sell our product as a fully integrated appliance, so our next major objective — and we announced some of this already — is to disaggregate the file system from the hardware. The reason we did that is to take advantage of commodity hardware choices on the market.

Once the file system is what we call ‘portable,’ meaning you can run it on any hardware, there will be a lot of new opportunity for us. That’s what you’ll be hearing from us in the next six months.

Would Panasas storage benefit by introducing an object storage platform, even as an archive device?

Pairman: You know, this is a question we’ve struggled with over the years. Our customers would like us to service the whole market. [Object storage] would be a very different financial profile than the markets we serve. As a small company, right now, it’s not a focus for us.

We differentiate in terms of performance and scale. Normally, what you see in scale-out NAS is that the bigger it gets, the more sluggish it tends to be. We have linear scalability, so the bigger our solution gets, the faster we go.

That’s critically important to the segments we serve. It’s different from object storage, which is all about being simple and the ability to get bigger and bigger. And performance is not a consideration.

Which vendors do you commonly face off with in deals? 

Pairman: Our primary competitor is IBM Spectrum Scale, with a file system and approach that is probably the most similar to our own and a very clear target on commercial HPC. We also run into Isilon, which plays more to commercial — meaning high reads, high usability features, but [decreased] performance at scale.

And then, at the very high end, we see DataDirect Networks (DDN) with a Lustre file system for all-out performance, but very little consideration for usability and manageability.

The niche is in the niche. We target very specific markets and very specific workloads.
Faye Pairman, CEO, Panasas

Which industry verticals are prominent users of Panasas storage architecture? Are you a niche within the niche of HPC?

Pairman: The niche is in the niche. We target very specific markets and very specific workloads. We serve all kinds of application environments, where we manage very large numbers of users and very large numbers of files.

Our target markets are manufacturing, which is a real sweet spot, as well as life sciences and media and entertainment. We also have a big practice in oil and gas exploration and all kinds of scientific applications, and even some manufacturing applications within the federal government.

Panasas storage is a hybrid system, and we manage a combination of disk and flash. With every use case, while we specialize in managing very large files, we also have the ability to manage whatever file sizes a company has, using flash.

What impact could DDN’s acquisition of open source Lustre exert on the scale-out sector, in general, and Panasas in particular?

Pairman: I think it’s a potential market-changer and might benefit us, which is why we’re keeping a close eye on where Lustre ends up. We don’t compete directly with Lustre, which is more at the high end.

Until now, Lustre always sat in pretty neutral hands. It was in a peaceful place with Intel and Seagate, but they both exited the Lustre business, and Lustre ended up in DDN’s hands. It remains to be seen what that portends. But there is a long list of vendors that depend on Lustre remaining neutral, and now it’s in the hands of the most aggressive competitor in that space.

What happens to Lustre is less relevant to us if it stays the same. If it falters, we think we have an opportunity to move into that space. It’s potentially a big shakeup that could benefit vendors like us who build a proprietary file system.

The nine roles you need on your data science research team

It’s easy to focus too much on building a data science research team loaded with Ph.D.s to do machine learning at the expense of developing other data science skills needed to compete in today’s data-driven, digital economy. While high-end, specialty data science skills for machine learning are important, they can also get in the way of a more pragmatic and useful adoption of data science. That’s the view of Cassie Kozyrkov, chief decision scientist at Google and a proponent of the democratization of data-based organizational decision-making.

To start, CIOs need to expand their thinking about the types of roles involved in implementing data science programs, Kozyrkov said at the recent Rev Data Science Leaders Summit in San Francisco.

For example, it’s important to think about data science research as a specialty role developed to provide intelligence for important business decisions. “If an answer involves one or more important decisions, then you need to bring in the data scientists,” said Kozyrkov, who designed Google’s analytics program and trained more than 15,000 Google employees in statistics, decision-making and machine learning.

But other tasks related to data analytics, like making informational charts, testing out various algorithms and making better decisions, are best handled by other data science team members with entirely different skill sets.

Data science roles: The nine must-haves

There are a variety of data science research roles for an organization to consider and certain characteristics best suited for each. Most enterprises already have correctly filled several of these data science positions, but most will also have people with the wrong skills or motivations in certain data science roles. This mismatch can slow things down or demotivate others throughout the enterprise, so it’s important for CIOs to carefully consider who staffs these roles to get the most from their data science research.

Here is Kozyrkov’s rundown of the essential data science roles and the part each plays in helping organizations make more intelligent business decisions.

Data engineers are people who have the skills and ability to get data required for analysis at scale.

Basic analysts could be anyone in the organization with a willingness to explore data and plot relationships using various tools. Kozyrkov suggested it may be hard for data scientists to cede some responsibility for basic analysis to others. But, in the long run, the value of data scientists will grow as more people throughout the company take on basic analytics.

Expert analysts, on the other hand, should be able to search through data sets quickly. You don’t want to put a software engineer or very methodical person in this role, because they are too slow.

“The expert software engineer will do something beautiful, but won’t look at much of your data sets,” Kozyrkov said. You want someone who is sloppy and will run around your data. Caution is warranted in buffering expert analysts from software developers inclined to complain about sloppy — yet quickly produced — code.

Statisticians are the spoilsports who will explain how your latest theory does not hold up for 20 different reasons. These people can kill motivation and excitement. But they are also important for coming to conclusions safely for important decisions.

A machine learning engineer is not a researcher who builds algorithms. Instead, these AI-focused computer programmers excel at moving a lot of data sets through a variety of software packages to decide if the output looks promising. The best person for this job is not a perfectionist who would slow things down by looking for the best algorithm.

A good machine learning engineer, in Kozyrkov’s view, is someone who doesn’t know what they are doing and will try out everything quickly. “The perfectionist needs to have the perfection encouraged out of them,” she said.

Too many businesses are trying to staff the team with a bunch of Ph.D. researchers. These folks want to do research, not solve a business problem.
Cassie Kozyrkov, chief decision scientist at Google

A data scientist is an expert who is well-trained in statistics and also good at machine learning. They tend to be expensive, so Kozyrkov recommended using them strategically.

A data science manager is a data scientist who wakes up one day and decides he or she wants to do something different to benefit the bottom line. These folks can connect the decision-making side of business with the data science of big data. “If you find one of these, grab them and never let them go,” Kozyrkov said.

A qualitative expert is a social scientist who can assess decision-making. This person is good at helping decision-makers set up a problem in a way that can be solved with data science. They tend to have better business management training than some of the other roles.

A data science researcher has the skills to craft customized data science and machine learning algorithms. Data science researchers should not be an early hire. “Too many businesses are trying to staff the team with a bunch of Ph.D. researchers. These folks want to do research, not solve a business problem,” Kozyrkov said. “This is a hire you only need in a few cases.”

Prioritize data science research projects

CIOs looking to build their data science research team should develop a strategy for prioritizing and assigning data science projects. (See the aforementioned advice on hiring data science researchers.)

Decisions about what to prioritize should involve front-line business managers, who can decide what data science projects are worth pursuing.

In the long run, some of the most valuable skills lie in learning how to bridge the gap between business decision-makers and other roles. Doing this in a pragmatic way requires training in statistics, neuroscience, psychology, economic management, social sciences and machine learning, Kozyrkov said. 

Shin’s story: Using technology to break down the barriers of disability in Japan – Asia News Center

Shin’s journey hasn’t been an easy one, but thanks to his parents lobbying a local education board – which once suggested Shin go to a special needs school – he has always studied at regular schools.

Since elementary school, he has studied with the help of computer software, such as Microsoft Word and OneNote. He uses a small, special mouse to draw graphs.

“By using Windows’ on-screen keyboard and moving the mouse, I can use my PC for study and communicating with my friends,” he explained.

Since 2013, Microsoft has assisted his learning, including preparation for the tough university entrance exam, by providing IT tools, such as the on-screen keyboard and a cursor control system that uses eye movements.

Shin is now trialing new eye-tracking software that enables him to move the mouse cursor with his eyes

“I have faced lots of challenges like everyone else, but we often need help too,” Shin said. “I’m currently trialing the new eye tracking software that enables me to move the mouse cursor with my eyes. This is one more example of how technology will help people like me work more efficiently.”

“My dream is that one day these kinds of functions will not be listed under accessibility but will be an integral part of how we all work to make a better future,” he added.

In 2016, Shin successfully passed the entrance exam for Tokyo University after spending a year at a preparatory school together with other students who aimed to enter the country’s competitive universities.

Now a university student, Shin continues to study on his electric stretcher with assistance and support from helpers and the school. Since April this year, he has lived on his own, with assistance when he needs to move.

The entrance exam for Tokyo University is one of Japan’s most competitive assessments. Before the exams, Shin submitted a request to the exam authority, the National Center for University Entrance Examinations, notifying them that his physical condition required more attention.

During the exam, Shin sat in a separate room with more time to take the paper, and was assigned an assistant to write down his answers. Shin was also allowed to use a computer, especially when an answer required a graph.

Shin’s favorite quote by Friedrich Nietzsche, the philosopher he admires, is “Man is something that shall be overcome.” The feisty student is often guided by these words when reflecting on his own physical disability.

Shin, now 21, studies Western Philosophy at Tokyo University

“I believe that we need a new inclusive philosophical framework because technology is now empowering people to become independent beyond any physical barriers,” he says.

Learning from those with disabilities to improve their opportunities

One of those working with people with disabilities, such as Shin, is Microsoft Japan employee Tomoko Ohshima.

Gathering their comments, requests and feedback, she passes those to the tech giant’s developers to create tools to help people with disabilities.

Ohshima was encouraged to take on this project by Microsoft Japan some ten years ago, inspired by her interactions with a colleague, a programmer who is blind. “Technology is so helpful for people!” she says.

Meanwhile, Japan’s entrance exam system is also improving to accommodate students with various disabilities. A consensus has been established to allow students with disabilities to use tools approved by the authorities, such as computers, and to extend the test time depending on each student’s condition. Ohshima’s commitment over the last ten years has coincided with this improvement and allowed her to witness the transition.

Challenges still remain for students with disabilities. For example, having a computer read out exam questions is rarely permitted in Japan. Instead, a reader is assigned to read the questions aloud for the examinee. This does not always work well for the students: some students might want to read important parts more slowly, and others might want to have questions read out repeatedly to better understand them.

One of the reasons computer reading has not been approved is that examiners need to create extra exam papers by digitizing them. This may be avoidable with optical character recognition (OCR).

“We are willing to provide any useful help and technology to create a society in which anyone can have the opportunity to take the entrance exams and be judged fairly regardless of one’s physical condition,” says Ohshima.



Three capabilities all leaders of innovation possess

Being a leader is never an easy job, but the job is getting harder, according to Linda Hill. To be a leader today, traditional management is not enough; you also have to help foster company-wide innovation or risk being left in the dust.

Hill, the Wallace Brett Donham professor of business administration at the Harvard Business School and co-author of the book Collective Genius: The Art and Practice of Leading Innovation, is in a good position to know that. She and her colleagues have spent the last decade observing leaders of innovation around the globe, trying to understand what makes them successful and searching for commonalities.

Speaking to an audience of IT leaders at the recent LiveWorx event in Boston, Mass., Hill said that even though there’s a lot of research on leadership and a lot of research on innovation, there’s actually very little research that looks at the connections between the two.

In her research on leaders of innovation — studying successful people at companies like Google, HCL Technologies, Volkswagen and Pixar, to name a few — Hill and her team certainly found differences in how these people went about their work, including cultural differences, organizational differences and varying leadership styles, she said. But they also found real commonalities in what these people did and why they did it.

Whether these leaders of innovation were working at an Islamic bank in Dubai, a social enterprise in east Africa or a luxury product brand in Korea — they all championed three types of creativity that became part of their organizations’ cultures and a key to their organizations’ capacity to innovate. Hill dubbed them “creative abrasion,” “creative agility” and “creative resolution.”

Hill detailed these three capabilities and encouraged IT leaders to find ways to incorporate them into their own organizational cultures to foster innovation.

Leadership expert Linda Hill speaks to IT leaders at the recent LiveWorx event in Boston, Mass.

Creative abrasion

The ability to generate a marketplace of ideas through discourse and debate.

“You rarely get innovation without diversity and conflict,” Hill said.

Organizations may do brainstorming sessions in which people can say whatever is on their mind without judgment, but it can’t all be sunshine, rainbows and pleasantries. You need some abrasion and pushback to not only refine ideas but also develop a robust pipeline of ideas, Hill said.

“What you see in these [innovation-forward] organizations is people know how to inquire and they know how to actively listen, but, guess what — they also know how to advocate for their point of view,” Hill said. One of the organizations she and her team looked at actually taught their employees how to advocate for their point of view to help push creative abrasion.

Good leaders of innovation also understand that one of their key roles is to make sure that they — and everyone else — hear the minority voice, Hill said. “That does not mean you do what that minority voice says, but if you don’t know what it is, then you haven’t been doing things properly.”

Creative agility

The ability to test and refine ideas through quick pursuit, reflection and adjustment.

In order to refine your pipeline of ideas even more, Hill said you need to go through the process of actually testing it, getting feedback and making the necessary adjustments — and in a timely manner. Hill finds that many companies put in place lean startup or design thinking approaches to help organizations become better at being agile.

One of the organizations Hill observed decided not to run pilots anymore because if you run a pilot and it doesn’t work, someone or something was “wrong.” Instead, they run experiments.

“When you run an experiment, you learn something one way or the other and you move on to the next one,” Hill said. “But if you do a pilot and it doesn’t work, then usually there are politics around that. People often ignore the feedback they’re getting or somebody pays the price because it ‘failed.'”

Creating a culture that makes people feel comfortable running experiments and putting themselves out there without fear of retribution is crucial.

“So many people report feeling that they are, in fact, punished when they speak out, fail or have a misstep,” Hill said. “If that is the case, there is not enough psychological safety in that environment for you to unleash the kinds of conversations necessary to hone your ideas.”

Creative resolution

The ability to make integrative decisions.

Most innovations are really a combination of ideas; very rarely is the innovation all new, Hill said. It could be a new idea and an old idea combined to solve an old problem, or two old ideas that together solve a new problem, or some other amalgam of new and old.

“Unless you do decision-making in a way that you can actually combine ideas, you rarely get the innovative solution,” Hill said.

As a result, what Hill sees in these innovation-forward organizations is that they’re very clear about who has decision-making rights, but they still make decisions in a more “inclusive and patient” way. By that she means they won’t allow one group to dominate. They won’t let the experts dominate, something she notes Steve Jobs was particularly worried about because he felt this group was the least likely to want change, since change would make their expertise less valuable than before. And these organizations don’t let the bosses dominate either.

“They will also not compromise, which is what we often do in these situations — go along to get along,” Hill said. “Instead, they will have, if you will, a fight. They will actually go through the creative abrasion process again and they will design the next experiment to get more data in order to move forward.”

Quality, meaningful future work hinges on human-machine ‘complementarity’

Massage therapists, breathe easy. Robots won’t steal your job. They don’t have the empathy, manual dexterity or ability to ask the right questions. Radiologists, take notice. A tool wired with machine learning algorithms can easily interpret medical images using computer-aided diagnosis systems.

But there are 26 other tasks radiologists do that machines can’t do well, such as administering conscious sedation during procedures or developing treatment plans, said Erik Brynjolfsson, director of the MIT Initiative on the Digital Economy. He shared the results of a study on which tasks in various jobs can be done well by machines at the MIT Sloan CIO Symposium in Cambridge, Mass.

“Machine learning was able to do some tasks but not others within a given occupation,” Brynjolfsson said to the audience of senior IT executives in May; he moderated a panel discussion on future work in a world where AI and machine learning are the norm. “And that means that most of the jobs in your organizations will be partly affected by machine learning, but there will also be things that the humans need to continue to do. We’ll have to have partnerships of humans and machines.”

As AI and machine learning advance and begin to match and even outdo humans in fundamental skills such as recognizing images, Brynjolfsson and other academics on the panel agreed, business processes need to be re-engineered and tasks reallocated among people and machines so that technology amplifies, not substitutes, human potential.

Elisabeth Reynolds, of MIT, speaks during a panel discussion on future work in an AI world at the MIT Sloan CIO Symposium in May. From left are moderator Erik Brynjolfsson, Jason Jackson and Iyad Rahwan, all of MIT, and Reynolds.

I, robot; you, human: We’re in this together

Panelist Elisabeth Reynolds, executive director of the MIT Industrial Performance Center, called the idea complementarity. A member of the MIT Task Force on the Work of the Future, convened to study the impact of digital innovations on the nature of work, Reynolds spoke about a narrative that has been taking shape in the public sphere: robots becoming so advanced that they steal people’s jobs and spur widespread hardship and despair.

We’ll have to have partnerships of humans and machines.
Erik Brynjolfsson, director, MIT Initiative on the Digital Economy

In the media, “technology is this thing that is happening to us, and we’re all trying to figure out how we’re going to survive,” she said. “We’re really interested in trying to change that narrative to one in which we understand technology as a tool.”

Reynolds cited a recent McKinsey & Co. report that estimated that robotics and AI would fully replace people in less than 5% of occupations by 2030. The technology would instead enhance or change 60% of jobs.

One example is from the manufacturing industry. Cobots, or collaborative robots, are being put to use alongside humans at companies such as FedEx, as reported in March by The New York Times, replacing routine work like lugging heavy items around a factory floor, and allowing workers to do other things, she said.

While automation will put some people out of work — “and we do have to deal with the displacement,” Reynolds said — the need for human workers will not go away.

Future work on the shop floor

Amazon’s huge distribution centers are another testing ground for human-machine collaboration. In the past, when someone bought an item from Amazon’s online marketplace, a worker called a picker would have to walk to its whereabouts on a shelf somewhere and back again to fill the order. Today, the facilities are buzzing with breadbox-shaped robots on wheels that identify the right shelves stocked with goods, lift them up and cart them to workers.

“They didn’t just do a one-for-one substitution of ‘Here’s a human; we’re going to replace them with a machine,'” Brynjolfsson said. “Instead they reinvented the whole process.”

That innovation has resulted in good and bad, Reynolds said. “People are not moving from A to B, long distances, etc.; they can stay where they are, and they can sort of pick and pack.”

The downside is the reformulated job of the picker isn’t ergonomic, she said — workers are standing in the same spot for eight-hour shifts, doing the same thing over and over. That’s difficult for people to do and potentially harmful. The challenge now is to design technology with humans in mind.

“We’ve solved one problem, but maybe we’ve created a few others,” Reynolds said. “We need to think about designing in the technology a way in which humans are actually advantaged and using all of the skills that they can bring to a job.”

A precarious future work?

Of course, Amazon’s robots are not the only AI-machine learning innovation with negative consequences for humans.

Take Uber. The ridesharing app’s platform uses machine learning algorithms to match riders and drivers. But Uber’s drivers, like all workers in the so-called gig economy, are contract workers, not full-time employees, and that means no fixed hours, no fixed pay, no employer-provided benefits. It’s a notion MIT’s Jason Jackson, assistant professor of political economy and urban planning, called precarity. It’s a form of the word precarious, and it refers to having insecure income and employment.

“Their worker status is now much more precarious both in terms of being able to be moved completely out of the firm, or the number of hours that they work becomes much more flexible,” Jackson said. From a managerial point of view, that flexibility can be very efficient, but for workers it means their cash flow becomes unstable.

“Most of us have fixed expenses — rent, mortgage — every month. So if your income becomes variable, then that creates a huge problem,” Jackson said.

Perpetual learning for survival

Conference attendee Theo Kornyoh is optimistic about AI and machine learning and future work. Kornyoh is CTO at Kaleida Health, a hospital network in western New York State. The technologies are transformative, making work and life much easier, he said. And they have different effects in different industries. In finance or retail, companies can look to AI tools to determine their customers’ needs, quickly address business concerns and expand.

In Kornyoh’s industry, healthcare, “it gives us a chance to be able to provide better services to our patients, also provide better services to our providers.”

As for whether technology will decimate entire job categories, like the panelists, Kornyoh doesn’t buy it.

“I think people just have to be able to be ready to learn and expand their knowledge base,” he said. “Because when you keep learning, I see automation and technology and ML and AI complement what you do rather than take your job away.”

Introduction to Azure Cloud Shell: Manage Azure from a Browser

Are you finding the GUI of Azure Portal difficult to work with?

You’re not alone, and it’s very easy to get lost. There are so many changes and updates made every day, and the Azure overview blades can be pretty clunky to traverse. However, with Azure Cloud Shell, we can use PowerShell or Bash to manage Azure resources instead of having to click around in the GUI.

So what is Azure Cloud Shell? It is a web-based shell that can be accessed via a web browser. It will automatically authenticate with your Azure sign-on credentials and allow you to manage all the Azure resources that your account has access to. This eliminates the need to load Azure modules on workstations. So for some situations where developers or IT Pros require shell access to their Azure resources, Azure Cloud Shell can be a very useful solution, as they won’t have to remote into “management” nodes that have the Azure PowerShell modules installed on them.


How Azure Cloud Shell Works

As of right now, Azure Cloud Shell gives users two different environments to use. One is a Bash environment, which is basically a terminal connection to a Linux VM in Azure that gets spun up. This VM is free of charge. The second environment available is a PowerShell environment, which runs Windows PowerShell on a Windows Server Core VM. You will need to have some storage provisioned on your Azure account in order to create the $home directory. This acts as the persistent storage for the console session and allows users to upload scripts to run on the console.

Getting Started

To get started using Azure Cloud Shell, go to shell.azure.com. You will be prompted to sign in with your Azure account credentials:

Azure Cloud Shell welcome

Now we have some options. We can select which environment we prefer to run in. We can run in a Bash shell or we can use PowerShell. Pick whichever one you’re more comfortable with. For this example, I’ve selected PowerShell:

Next, we get a prompt for storage, since we haven’t configured the shell settings with this account yet. Simply select the “Create Now” button to go ahead and have Azure create a new resource group, or select “Show Advanced Settings” to configure those settings to your preference:

Once the storage is provisioned, we will wait a little bit for the console to finish loading, and then the shell should be ready for us to use!

In the upper left corner, we have all of the various controls for the console. We can reset the console, start a new session, switch to Bash, and upload files to our cloud drive:

As an example, I uploaded an activate.bat script file to my cloud drive. To access it, we simply reference $home and specify our CloudDrive:

Now I can see my script:
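
As a rough sketch of what that looks like in the console (the exact path follows the description above and may differ between the Windows and Linux experiences; activate.bat is just the file uploaded earlier):

# List the contents of the persistent cloud drive under $home
Get-ChildItem "$HOME\CloudDrive"
# Run the uploaded batch file from there
& "$HOME\CloudDrive\activate.bat"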

This will allow you to deploy your custom PowerShell scripts and modules in Azure from any device (assuming you have access to a web browser, of course). Pretty neat!

Upcoming Changes and Things to Note

  • On May 21st, Microsoft announced that it will be moving to a Linux platform for both the PowerShell and Bash experiences. How is this possible? Essentially, a Linux container will host the shell. By default, PowerShell Core 6 will be the first experience, and Microsoft claims the startup time will be much faster than in previous versions because of the Linux container. To switch from PowerShell to Bash in the console, simply type “bash”; to go back to PowerShell Core, just type “pwsh”.
  • Microsoft is planning on having “persistent settings” for Git and SSH tools so that the settings for these tools are saved to the CloudDrive and users won’t have to hassle with them all the time.
  • There is some ongoing pain with modules currently. Microsoft is still working on porting modules over to .NET Core (for use with PowerShell Core), and there will be a transition period while this happens. It is prioritizing the most commonly used modules first. In the meantime, there is one workaround that many people seem to forget: implicit remoting. This is the process of taking a module that is already installed on another endpoint and importing it into your PowerShell session, allowing you to call that module and have it execute remotely on the node where it is installed (a short sketch follows this list). It can be very useful for now, until more modules are converted over to .NET Core.
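
For illustration, implicit remoting might look something like the following; the computer name, module and cmdlet are hypothetical placeholders for whatever happens to be installed on your own management node:

# Hypothetical management node that already has the module installed
$session = New-PSSession -ComputerName MgmtServer01
# Import the remote module's commands into the local session; they execute on MgmtServer01
Import-PSSession -Session $session -Module ActiveDirectory
# The imported cmdlets can now be called as if they were local
Get-ADUser -Filter * | Select-Object -First 5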

Want to Learn More About Microsoft Cloud Services?

The development pace of Azure is one of the most aggressive in the market today, and as you can see, Azure Cloud Shell is constantly being updated and improved. In the near future, it will most likely be one of the more commonly used methods for interacting with Azure resources. It provides Azure customers with a seamless way of managing and automating their Azure resources without having to authenticate over and over again or install extra snap-ins and modules, and it will continue to shape the way we do IT.

What are your thoughts regarding the Azure Cloud Shell? Have you used it yet? What are your initial thoughts? Let us know in the comments section below!

Do you have an interest in more Azure goodness? Are you wondering how to get started with the cloud and move some existing resources into Microsoft Azure? We actually have a panel-style webinar coming up in June that addresses those questions. Join Andy Syrewicze, Didier Van Hoye, and Thomas Maurer for a crash course on how you can plan your journey effectively and smoothly using the exciting cloud technologies coming out of Microsoft, including:

  • Windows Server 2019 and the Software-Defined Datacenter
  • New Management Experiences for Infrastructure with Windows Admin Center
  • Hosting an Enterprise Grade Cloud in your datacenter with Azure Stack
  • Taking your first steps into the public cloud with Azure IaaS


27″ iMac – Late 2009 – £250

Good condition Late 2009 27″ iMac – no scratches to the screen or casing
12GB of RAM
128GB SSD – Easy to upgrade to a bigger drive yourself if this isn’t enough
i5 2.66GHz
ATI Radeon 4850 512mb
Sierra OS
Apple Magic Mouse
Apple Wireless Keyboard

Only issue I’ve had with it is that if I let it sleep of its own accord it won’t wake up again. To combat this I’ve turned off auto sleep, and when I’m finished using it I just press the power button once and sleep it manually, then it wakes fine….


corsair carbide 400c case and dr delid tool for intel

dr delid tool used once to delid my 8600k, very easy to use. £20 posted or £15 collected.

corsair carbide 400c, it’s missing the PSU plastic shroud cover and the SSD mount for the back (I use double-sided tape anyway), apart from that in good condition. £35 collected from Wakefield.
Really nice case.

Price and currency: corsair carbide 400c case and dr delid tool for intel
Delivery: Delivery cost is not included
Payment method: paypal or cash
Location: wakefield…
