
Enhanced debugging and faster simulation with the latest Quantum Development Kit update

This post was authored with contributions by Cathy Palmer, Program Manager, Quantum Software & Services.

Today, Microsoft released an update to the Microsoft Quantum Development Kit including an enhanced debugging experience and faster simulations, as well as several contributions from the Q# community. We’re excited about the momentum generated by the many new Q# developers joining us in building a new generation of quantum computing.

Just over six months ago, we released a preview of Q#, our new programming language for quantum development featuring rich integration with Visual Studio. The February 26 release added integration with Visual Studio Code to support Q# development on macOS and Linux as well as Python interoperability for Windows. Since then, tens of thousands of developers have begun to explore Q# and the world of quantum development.

Today’s update includes significant performance improvements for simulations, regardless of the number of qubits required, as shown in the H2 simulation below. This is a standard sample included in the Microsoft Quantum Development Kit.

Simulation comparison

This update includes new debugging functionality within Visual Studio. The probability of measuring a “1” on a qubit is now automatically shown in the Visual Studio debugging window, making it easier to check the accuracy of your code. The release also improves the display of variable properties, enhancing the readability of the quantum state.

Screen showing enhanced debugging

Adding to the new debugging improvements, you’ll find two new functions that output probability information related to the target quantum machine at a specified point in time, called DumpMachine and DumpRegister. To learn more, you can review this additional information on debugging quantum programs.
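The quantity the debugger now surfaces is simple to state: the probability of measuring "1" on a qubit is the sum of the squared amplitude magnitudes over all basis states in which that qubit is set. As a rough illustration of what the display reports (this is not the QDK API; the function name and state layout are invented for this sketch):

```python
import math

def prob_of_one(amplitudes, qubit):
    """Probability of measuring |1> on `qubit`, given the full state
    vector of a qubit register (little-endian basis-state indexing)."""
    return sum(
        abs(a) ** 2
        for i, a in enumerate(amplitudes)
        if (i >> qubit) & 1  # keep basis states where this qubit is 1
    )

# Example: the Bell state (|00> + |11>) / sqrt(2) on two qubits.
bell = [1 / math.sqrt(2), 0, 0, 1 / math.sqrt(2)]
print(prob_of_one(bell, qubit=0))  # ≈ 0.5 for either qubit
```

DumpMachine and DumpRegister report this kind of information for the whole machine or a chosen register, respectively, at the point in the program where they are called.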

Thanks to your community contributions, the Microsoft Quantum Development Kit now includes new helper functions and operations, plus new samples to improve the onboarding and debugging experience. Check out the release notes for a full list of contributions.

Download the latest Microsoft Quantum Development Kit

We’ve been thrilled with the participation, contributions, and inspiring work of the Q# community. We can’t wait to see what you do next.

How a first-time teacher brought new energy to education in rural Morocco

Teaching wasn’t really on my to-do list. My ambition was to be a financial manager once I graduated from university, but instead I followed my father’s path into teaching. And in my country, Morocco, that means consigning yourself to an isolated region for the first few years of your career. No electricity, no drinkable water, and in winter you might have to cross rivers just to get to school.

Unlike many educators around the world, one of my challenges wasn’t to integrate technology into a modern urban classroom – it was to make it work in a rural environment, where students, their parents and their siblings have never so much as touched a PC or used the internet. But even in this situation, or maybe because of it, I started to change my mind about my career. I began to like my new job. Those innocent eyes waiting for me every morning pushed me into giving everything I have to improve education for children in rural places.

As a teacher and messenger of knowledge, situated in hard conditions, I had two choices: surrender to the reality, or choose the path of innovative educators.

My classroom didn’t have electricity. The internet and mobile signals in the area were weak, and I had to walk a five-mile round trip, six days per week, over the mountains to get to the school. Still, I believed in the power of information and communication through technology, and I tried hard to surpass any technical or logistical problems, just to take my students to another climate of learning and bring my classroom to life. Where to start?

With most students here passing their time after school (and even at dawn) herding and guarding sheep, looking for water or helping their families at shelters, school just wasn’t the biggest priority. To figure out how to reduce absence, I needed to know more about it.

First, I used Microsoft Excel as a master tool to collect and analyze absence data, with clear definitions of when dropouts were happening. I asked for the absence data archive from the principal director and combined it with what I recorded every school day. From the results I concluded the highest rate of absence was on Fridays, which coincided with the most popular day for students to play, meet friends and step out of their routine life. It was all happening at the souk, an atmospheric and vibrant marketplace full of food and furniture, toys, candy, old comic books and other goods. In trying to think of something bigger, something more exciting and more attractive to get the students to their teacher, I decided to visit the souk myself and make a plan.
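The analysis described above boils down to tallying absences by weekday and finding the peak day, which any tool can do; as a small sketch in Python (the names and records below are invented for illustration):

```python
from collections import Counter

# Hypothetical absence log: (weekday, student) pairs, one per absence,
# merged from the principal's archive and the daily classroom record.
absences = [
    ("Mon", "Amal"), ("Fri", "Hassan"), ("Fri", "Amal"),
    ("Tue", "Yousra"), ("Fri", "Yousra"), ("Wed", "Hassan"),
]

# Count absences per weekday and pick the worst day.
by_day = Counter(day for day, _student in absences)
worst_day, count = by_day.most_common(1)[0]
print(worst_day, count)  # → Fri 3
```

In Excel the same tally is a pivot table or COUNTIF over the weekday column; the point is the workflow, not the tool.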

I bought a second laptop and additional batteries, so I wouldn’t have to worry about losing power in the class. It was a little hard at the beginning, carrying two laptops in my bag for a 5-mile round trip to get to the school, but after some weeks I got used to it.

Each Friday, a raffle would be waiting for my students at the classroom. During recess, we’d organize a draw, and the winner would have the chance to use the laptop and choose between watching cartoons, playing educational video games, or writing in Microsoft Word.

At the beginning, I thought my students would choose to play games or watch videos when they had their chance, but I was wrong. Most of them preferred to explore Word and they became so excited when they typed in their names and some words and paragraphs.

Giving my students the opportunity to use the PC and freely connect with technology had a powerful impact on combating the absence phenomenon. My students now prefer coming to school and they’re starting to convince their parents and siblings about the importance of school and ICT (Information and Communication Technologies). More recently, we’ve been holding a “Friday Surprise” each week, where students can express themselves and develop their skills by creating handmade decorations, using the laptop to look for creative ideas, to draw, or do other things that improve communication, collaboration, presentation, creativity, problem solving, and critical thinking.

There are some other educational issues we see in the multi-grade classroom. Some multi-grade teachers may teach two grades in the same class, while others may teach three or four grades. I’m teaching six grades. The students in these grades are usually of the same age but may differ in their abilities, which means:

  • Planning can be time consuming.
  • Teachers may be frustrated due to their geographical isolation.
  • Physical conditions may be unattractive. Some classrooms are very small and overcrowded.
  • Few materials are available for multi-grade teaching.

To take this challenge on, I thought about how being a teacher in a rural area didn’t prevent me from increasing my knowledge, or developing my professional and personal skills. I tried to use the internet to get away from the isolation and be a part of the community of innovative educators. After learning about new methods and experiences all over the planet, I decided to let my students choose, by themselves, to come to school, even on special days, rather than imposing it on them. With ICT, I would rather make them eager to build knowledge. I encouraged them to try new things and never be afraid of change. That’s why using ICT has had a positive impact not only in my classroom, but on the whole school environment.

For me, the weak infrastructure, the absence of digital tools and the lack of awareness of education’s importance are no excuse – we can still create and think of innovative ways to make our students love coming to school.

To meet the varied needs of multi-grade students, teachers need in-depth knowledge of child development and learning and a larger repertoire of instructional strategies than most single-grade teachers possess. They must be able to design open-ended, divergent learning experiences accessible to students functioning at different levels. They must know when and how to use homogeneous and heterogeneous grouping and how to design cooperative group tasks. They must be proficient in assessing, evaluating, and recording student progress using qualitative methods.

Multi-grade teachers must be able to facilitate positive group interaction and to teach social skills and independent learning skills to individual students. They must know how to plan and work cooperatively with colleagues, as team teaching is commonly combined with multi-grade organization. Finally, they must be able to explain multi-grade practices to parents and other community members, building understanding and support for their use.

The wealth of digital tools makes it easy to create your own educational materials, and there are many advantages in doing so. As a teacher, the learning for your students is strengthened by your voice and pedagogy. The students can study at their own pace and learn at their level. These are some of my strategies:

  • Consider students’ needs and their different levels of knowledge by preparing my own lesson plans.
  • Make the explanation more attractive for my students.
  • Effectively manage the lesson’s time.
  • Develop game-based learning.
  • Improve real-world problem solving and collaboration.

Microsoft technologies helped me perform my tasks more quickly and efficiently. Specifically:

  1. Planning: Microsoft offers planning templates that you can customize to your requirements. You can update and reuse these when you teach the lessons again.
  2. Record keeping: By maintaining electronic documents you can quickly access and update information, making it easier to share and cross reference.
  3. Assessing: With Microsoft Word, Excel and PowerPoint you can design assessments with automated marking.
  4. Coordinating and communicating: E-mail is a useful option to communicate. Microsoft Outlook offers the option of a shared calendar, which makes coordination efficient. You can use a blog or webpage that parents visit for updates.
  5. Collaborating: Shared workspaces or collaboration tools, such as SharePoint, Skype, Skype for Business, and Office 365 make it easier to collaborate on documents and hold virtual meetings.

For me, as a primary school teacher, my love for this noble job has grown far beyond what I ever expected. I have learned that the teacher doesn’t just light up minds, but hearts as well. I learned that teaching is art and love before it’s a job. I learned that education has no borders.

Top image: Bayla Khalid attending Education Exchange 2018 in Singapore, where he met educators from around the world.

To learn more about Microsoft Education and our tools and technology that help foster inclusion and support personalizing learning for every student, click here.

The December release of SQL Operations Studio is now available

This post is authored by Alan Yu, Program Manager, SQL Server.

We are excited to announce the December release of SQL Operations Studio is now available.

Download SQL Operations Studio and review the Release Notes to get started.

SQL Operations Studio is a data management tool that enables you to work with SQL Server, Azure SQL DB and SQL DW from Windows, macOS and Linux. To learn more, visit our GitHub.

SQL Operations Studio was announced for Public Preview on November 15th at Connect(), and this December release is the first major update since the announcement.

The December release includes several major repo updates and feature releases, including:

  • Migrating SQL Ops Studio Engineering to public GitHub repo
  • Azure Integration with Create Firewall Rule
  • Windows Setup and Linux DEB/RPM installation packages
  • Manage Dashboard visual layout editor
  • “Run Current Query with Actual Plan” command

For complete updates, refer to the Release Notes.

Migrating SQL Ops Studio Engineering to public GitHub repo

To provide better transparency with the SQL Operations Studio community, we have decided to migrate the internal GitHub branch to the public repo. This means any bug fixes, feature developments, or even test builds can be publicly viewed before an update is officially announced.

We made this move because we want to collaborate with the community to continually deliver features that our users want. This gives you the opportunity to see our changes in action to address your top voted issues. Visit our GitHub page and give us your feedback.

Azure Integration with Create Firewall Rule

Now let’s get into new features. A common issue when connecting to Azure SQL DB instances is that the connection can fail due to server firewall rules. This would require loading Azure Portal to configure firewall rules so that you can connect to your database, which can be inconvenient.

To speed up this process, we have enabled Azure Integration with Create Firewall Rule dialog. When your connection to an Azure SQL DB instance fails because of firewall settings, this dialog will appear, allowing the user to use their Azure subscription account to automatically configure the client IP address with the server. This retains the same experience as configuration on Azure Portal, except you can do it all through SQL Operations Studio.

Windows Setup installation and Linux DEB/RPM installation packages

We are always looking for new ways to improve the installation experience. With the December release, we have added Windows Setup installation to simplify installation on Windows. This wizard will allow the user to:

  • Select installation location
  • Select start menu folder
  • Option to add to path

In addition to Windows Setup, we have also added Linux DEB/RPM installation packages. These will add new ways for Linux users to download SQL Operations Studio for their choice of installation.

Feel free to try out these new installation experiences on our download page.

Manage Dashboard visual layout editor

In the initial release, there were not many options to customize the visual layout of the dashboards. With the December release, you can now resize and move your widgets by enabling the visual layout editor mode by clicking the pencil on the top right of the Manage Dashboard screen. This gives users greater control of their dashboard in addition to building their own custom insight widgets.

Run Current Query with Actual Plan command

Another new feature we have enabled is Run Current Query with Actual Plan, a command that executes the current query and returns the actual execution plan with the query results. This feature area is still in progress as we work through the best UX for integrating this command directly into the query editor. While that design work is in progress, the functionality is available via the Command Palette, and you can define a keyboard shortcut if you use this feature frequently.
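Because SQL Operations Studio inherits Visual Studio Code’s keybinding system, a shortcut can be defined in keybindings.json along these lines. Note that both the key combination and the command identifier below are assumptions for illustration; check the command’s entry in the Command Palette (or the Keyboard Shortcuts editor) for the exact identifier:

```json
// keybindings.json — command id is a placeholder, not the confirmed name
[
    {
        "key": "ctrl+shift+e",
        "command": "runCurrentQueryWithActualPlan",
        "when": "editorTextFocus"
    }
]
```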

Contact us

If you have any feature requests or issues, please submit to our GitHub issues page. For any questions, feel free to comment below or tweet us @sqlopsstudio.

How State Bank of India, a 215-year-old bank, hit refresh to become a modern workplace – Microsoft News Center India

It is a big day for Rachna Gupta, an Assistant Manager at State Bank of India (SBI). After dropping her 11-year-old daughter at school, she hurries to the Mayur Vihar Metro station for her daily hour-long commute to Chandni Chowk. Her thoughts are preoccupied with the upcoming presentation. “Will there be any last-minute hiccups?” she nervously wonders.

Her smartphone pings as she exits from the Chandni Chowk Metro station and hails a cycle rickshaw. It is an email from her team asking for changes in her presentation. Unlike earlier, when she’d have to wait to reach her desk to get any work done, Gupta opens the PowerPoint file from OneDrive on her phone. As the rickshaw snakes through the narrow lanes of the original walled city of old Delhi, she makes the changes and shares the new file with her team.

“For customers, banking has transformed completely. But technology has also made the lives of employees smoother, and tension-free,” she says as she gets off at the 200-year-old heritage building that was once a palace belonging to Begum Samru. The building is a fitting location for SBI, which also traces its roots to pre-Independence India, with the formation of the Bank of Calcutta in 1802.

Gupta is one of the 263,000 employees at SBI who are reaping the benefits of a modern workplace. This is the story of how one of the oldest banks in the world embarked upon a digital transformation journey for more than a quarter of a million employees, who serve over half a billion customer accounts.

The challenge

As with any organization with the scale and size of SBI, different technology solutions implemented at different times meant that most solutions were not talking to each other while some were archaic.

Emails were taking hours to get delivered, and employees had to clear their inboxes frequently to make space for new messages. Documents were being shared as attachments, with multiple versions getting created. There was no Global Address List, and most employees could not access the official intranet on their phones – they had to be at their desks, in front of their PCs, to do anything.

Things were even more difficult for senior employees, who had to travel to various branches for meetings as they could not check in remotely. Teams in different branches, even in big cities, had no seamless way to connect with each other.

It was becoming clear to the senior leadership that for a behemoth like SBI, workplace transformation was essential to fulfill the service expected by its 500 million customer accounts, and to retain its leadership position in the super-competitive banking sector in India.

“It was vital to take digital transformation to our workforce – empowering them to become digitally enabled. We had to ‘Hit Refresh’,” says Arundhati Bhattacharya, who recently retired as SBI’s Chairperson and who had initiated the digital transformation at the bank.

The need of the hour

SBI required a solution that would address three key challenges: an integrated platform approach for all productivity requirements; simple-to-apply use cases so that even employees in remote areas could be included; and an agile, cloud-based platform that would give the bank the scale of operations it required. Additionally, it was important to give employees a seamless experience across devices like mobiles, tablets and PCs.

Microsoft’s modern cloud technology fitted perfectly with SBI’s vision of a contemporary workplace. Microsoft assessed the work environment and created role-based access profiles, including all employees from Chief Managers to Officers, clerical staff and other categories of employees, even those who have retired. Today, 263,000 SBI employees are on Office 365, using services like Exchange Online, OneDrive, Skype for Business, SharePoint Online, among others as a part of their daily work tools.

“The mobility solution from Microsoft helps us exercise continuous control over all the enabled devices. Our employees will now experience a modern digital workplace platform that will empower them to collaborate effectively from any device anywhere (Android, iOS, Mac and Windows), provide an integrated experience and reduce complexity,” says Mrityunjay Mahapatra, Deputy Managing Director & Chief Information Officer, SBI.

“We are excited about our partnership with Microsoft. As India’s economy continues to grow, the BFSI sector needs to be well-equipped to address the dynamic market pressures and evolving industry needs. It has become imperative to transform technologically to sustain a competitive edge. A digital culture shift, designing a modern workplace that harnesses digital intelligence and enabling mobility are key aspects. Microsoft’s cutting-edge technology is helping us lead this digital transformation by making it part of our DNA,” says Rajnish Kumar, Chairman, SBI.

On ground zero

At SBI’s Chandni Chowk branch, meanwhile, Gupta’s presentation went off better than expected. “The work in banking is the same, but the way we do it is different. Everyone’s productivity has increased. We are doing the same work with more enthusiasm,” says a beaming Gupta.

As she prepares to leave for home to spend time with her daughter and family, we ask her what her ideal workplace of the future would be. She closes her eyes, “There may be a point in future, where I may close my eyes and imagine myself in office and I will be in office.”

Well, who knows what the future has in store?

SQL Server 2017 and Azure Data Services – The ultimate hybrid data platform

This post was authored by Rohan Kumar, General Manager Database Systems Engineering.

Today at PASS Summit 2017, we are showcasing new advances across SQL Server 2017 and our Azure data services. And we’re demonstrating how these products – both on-premises and in the cloud – come together to form the ultimate hybrid data platform. Our recent announcements, here at PASS Summit and at Ignite in September are great examples of how we’re empowering data professionals – like our community here at PASS Summit 2017 – to do more and achieve more.

With the recent launch of SQL Server 2017, the first database with AI built-in, you can now run your production workloads on Linux, Windows, and Docker. SQL Server 2017 delivers a mission-critical database with everything built-in, on the platform of your choice. And it can unlock seamless DevOps with Docker Enterprise Edition containers, bringing efficiency and simplicity to your innovation. New features enable analysis of graph data, and advanced analytics using Python and R. We have incorporated your feedback to add features that will make your day-to-day job easier, like Adaptive Query Processing and Automatic Plan Correction, which find and fix performance regressions automatically.

In addition, SQL Server 2017 running on Windows and Linux can take advantage of new leaps forward in hardware. As demonstrated today by Bob Ward, SQL Server 2017 on SUSE Enterprise Linux Server on an HPE ProLiant DL380 Gen 10 Server with scalable persistent memory ran queries more than 5 times faster than a fast SSD drive array at 50% of the cost – making it the world’s first enterprise-grade diskless database server.

These new cross-platform capabilities have made SQL Server accessible to users of Windows, Linux and Mac. At PASS Summit, we are excited to provide a sneak peek at Microsoft SQL Operations Studio. In a few weeks, users will be able to download and evaluate this free, light-weight tool for modern database development and operations on Windows, Mac or Linux machines for SQL Server, Azure SQL Database, and Azure SQL Data Warehouse. Increase your productivity with smart T-SQL code snippets and customizable dashboards to monitor and quickly detect performance bottlenecks in your SQL databases on-premises or in Azure. You’ll be able to leverage your favorite command line tools like Bash, PowerShell, sqlcmd, bcp and ssh in the Integrated Terminal window. Users can contribute directly to SQL Operations Studio via pull requests from the GitHub repository.

For customers who are ready to modernize to the cloud, Azure SQL Database Managed Instance and Azure Database Migration Service, both now in private preview, make it easy to lift and shift your on-premises SQL Server workloads with few or no changes. The upcoming Azure Hybrid Benefit for SQL Server enables customers to use on-premises SQL Server licenses for the easiest lift and shift of SQL Server workloads to the fully managed cloud.

Azure SQL Database is ready for your most mission-critical workloads. Today, we demonstrated the high scale and performance of SQL Database, with the ability to insert 1.4 million rows per second. In addition, we are making it easier than ever to gain insights from data at this scale. We recently made generally available the ability to run advanced analytics models from Azure Machine Learning super-fast from T-SQL, with new Native T-SQL scoring. And in today’s demos, we show how you can use this new capability to score large amounts of data in real time – at an average of under 20ms per row!

Starting in early 2016, we have been delivering machine-learning-based customer value directly into the Azure SQL Database managed service. More recently, we’ve delivered several intelligent capabilities including automatic tuning, performance monitoring and tuning, Adaptive Query Processing, and Threat Detection. These capabilities significantly reduce the time required to manage anywhere from one to thousands of databases and help proactively defend against potential threats. And the preview Vulnerability Assessment feature helps you more easily understand your security and compliance posture with new initiatives like GDPR.

In addition to sharing the same “everything built-in” SKU model with SQL Server for lower total cost of ownership versus competitors, Azure SQL Database adds value to your database with these built-in administration features. Now it’s easier than ever to move to Azure SQL Database with our new partnership with Attunity. Customers can now take advantage of Attunity Replicate for Microsoft Migrations, a free offer that accelerates migrations from various database systems, including Oracle, Amazon Redshift, and PostgreSQL to the Microsoft Data Platform.

To simplify analytics in the cloud, we’re releasing a public preview of new hybrid data integration capabilities in Azure Data Factory including the ability to run SSIS packages within the service. This means you can run your SSIS data integration workloads in Azure, without changing the packages – for true hybrid data integration across on-premises and cloud. And our SSIS partner technologies like Biml can now work to automate and enhance data integration across on-premises and cloud.

Dramatic scale investments are now in public preview for Azure SQL Data Warehouse. With the new Compute-Optimized Tier, you can get up to 100x the performance and 5x the scale. This new tier builds on the existing optimized for elasticity tier – giving customers the benefit of a fully-managed platform that suits the demands of their workload.

Visualizing data helps users across the organization take informed action. Microsoft delivers Business Intelligence capabilities to help customers model, analyze, and deliver business insights, which can be consumed by business users on mobile devices, on the web or embedded in applications.

Analysis Services helps you transform complex data into semantic models making it easy for business users to understand and analyze large amounts of data across different data sources. The same enterprise-grade SQL Server Analysis Services you’ve used on-premises is also available as a fully managed service in Azure. With today’s announcement of the Scale Out feature for Azure Analysis Services, you can easily set the number of replicas for your Azure Analysis Services instances to support large numbers of concurrent users with blazing fast query performance.

Power BI is a market-leading SaaS service that is easy to get started with and simple to use, with data connections to many on-premises and cloud data sources. It allows users across roles and industries to go from data to insights in minutes. A recent key addition to the Power BI portfolio is Power BI Report Server, which enables self-service BI and enterprise reporting, all in one on-premises solution, by allowing you to manage your SQL Server Reporting Services (SSRS) reports alongside your Power BI reports. Today we announce the availability of a new version of Power BI Report Server that enables you to keep Power BI reports on-premises while connecting to any data! Read more on the Power BI blog.

Microsoft’s guiding principle has been to build the highest performing, most secure, and consistent data platform for all your applications across on-premises and cloud. By joining us on this journey, you can build upon your investments in SQL Server to expand the scope of your role in your organization, from database systems to advanced analytics and artificial intelligence. We look forward to working with you!

If you aren’t with us at PASS Summit 2017 this week, you can still see the announcements and demonstrations by purchasing session recordings to stream or download and watch at home. Sign up at the PASS Summit Website.

If you’d like to learn more about the latest enhancements in the Microsoft Data Platform, visit our data platform webpage, or data platform overview on Azure.

Challenge the status quo and reinvent the future of disability employment

By Jessica Rafuse, NGO Program Manager, Microsoft 


In celebration of National Disability Employment Awareness Month, Microsoft recently invited Seattle-area businesses, agencies, and NGO representatives for transparent conversations on the benefits, challenges and potential solutions regarding disability employment. The goal of the gathering was to bring together the community of people with disabilities and employers to challenge the status quo and reinvent the future of disability employment. Together, we have the power to make a difference to an unemployment rate that hasn’t materially changed in nearly 30 years.

Woman speaking to crowd

Jenny Lay-Flurrie, chief accessibility officer at Microsoft, welcomes the audience.

As an employment attorney by trade and former administrative judge, I have seen the worst of employment discrimination. As a person who uses a wheelchair, I have experienced the same biases, stigma, and misconceptions that feed the national epidemic of disability unemployment and underemployment. Yet, this often inaccessible world has made people with disabilities determined, adaptable, diverse, and excited to collaborate with employers to reinvent the future of disability employment. What if we all make a conscious effort to view disability as a strength that makes us the very best candidates for the job?

Microsoft Chief Accessibility Officer, Jenny Lay-Flurrie, kicked off the day with a high-energy account of our own journey at the company. Jenny candidly shared the history of disability inclusion and accessibility at Microsoft, including our stumbles along the way, as well as the successes that make us proud. Many of those moments are now captured in the Microsoft Disability Inclusion Journey, a Sway presentation outlining our programmatic approach to disability inclusion. 

Drive, determination, empathy and adaptability

The opening panel, composed of business leaders with both apparent and non-apparent disabilities, highlighted the drive, determination, empathy, and adaptability that each panelist has developed because of their disabilities. The conversation was a celebration of disability as a strength and a sharing of individual experiences in the workplace. As Katherine Schomer, VP Research of Edelman Intelligence, clearly articulated, “My disability has taught me to succeed.”

Woman in wheelchair speaking to audience

Katherine Schomer, VP Research of Edelman Intelligence, shares her thoughts during the opening panel.

The panelists concluded the conversation with a challenge to the employers: talk with your employees early and often about how and when to offer support – relevant advice for employees both with and without disabilities. The panelists reminded us that managers of people with disabilities often develop the leadership skills that make them better managers to every employee.

Recruitment, Accommodations, and Employee Resource Groups

At the heart of the symposium, we responded to the need for more knowledge on how organizations are moving the needle in disability employment. As such, the breakout sessions proved to be a rich learning opportunity in Recruitment, Accommodations, and Employee Resource Groups (ERGs).

The key learnings we want to share with those looking to hire and retain people with disabilities are broken down by category below.

Recruitment

  • Partner with the community. If you’re unsure where to start, the best first step we can advise is to ask. There are nonprofits and agencies who can help you begin to find talent to build a pipeline of candidates. Another excellent resource is the team you already have: encourage people to reach out to their network and alert them to your company’s recruiting efforts.
  • Train your Talent Acquisition Team. Having a process in place and a list of resources will make the interview and onboarding processes for potential candidates go much more smoothly. Don't forget to prepare your talent acquisition team, too: with the right resources, training, and guidance, they will be able to offer accommodations and excellent customer service to candidates with disabilities.
  • Branding matters. Including people with disabilities in your marketing efforts will attract talent with disabilities. From words to images to how accessible your website is, including people of all abilities will attract a wider spectrum of candidates.

Accommodations

  • Ask the experts. Employers should tap into the expertise of employees with disabilities when evaluating the onboarding process, benefits, facilities, and corporate culture.
  • Be loud. Companies should share their policies and practices often, so that employees know where to go for accommodations and feel welcome disclosing their disabilities.
  • Design inclusively. Go beyond thinking about accommodations as programmatic solutions for an individual. Design your businesses inclusively from the start to benefit every person.

Employee Resource Groups

  • Align your strategy. Supporting the business in achieving its goals through the lens of disability inclusion will solidify the ERG’s role as a value-added partner to the business.
  • Build Accessibility Ambassadors. Encourage ERG members across the spectrum of disability to share best practices for disability inclusion, including physical and digital accessibility for all.
  • Gather the Community. Celebrate successes and explore solutions for challenges through meetings, webinars, and events.

Three women speaking on a panel

Liz Taub (center), Executive Vice President of USBLN, speaks on a panel.

Liz Taub of USBLN summarized, “When people feel valued and can bring their whole selves to work, they have permission to innovate. Some of the most popular and lucrative inventions have been started by people with disabilities.”

A man in a suit speaks to audience

Kenneth Price, Director of Aircraft Valuation at Boeing, shares his personal journey with a disability.

Our day of learning wrapped with a keynote speech from Kenneth Price, Director of Aircraft Valuation at Boeing, who shared his personal experience with both an apparent and non-apparent disability. Kenneth emphasized the importance of having the support of senior executives when building an inclusive corporate culture.

Employers and organizations looking to learn more about disability employment can reach out to USBLN to connect with other companies engaged in the effort. Find out more about our efforts at Microsoft Inclusive Hiring or within the new Microsoft Disability Inclusion Journey.

Even though the symposium was a cause for celebration, there is still much work to be done. At Microsoft, we are excited to work together to have a positive impact on the unemployment and underemployment rates for people with disabilities.

Happy NDEAM Everyone!

Maxta hyper-converged MxSP helps services firm fuel growth

When Larry Chapman arrived at Trusource Labs as IT manager, the technical support services provider was in the hyper-growth stage, while its IT infrastructure was stuck in neutral.

The IT infrastructure had no shared storage and ran on a single server that constituted a single point of failure. Chapman decided to upgrade it in one shot with hyper-convergence. Trusource installed Maxta MxSP software-based hyper-convergence running on Dell PowerEdge servers last April when it opened a new call center.

The company started in 2013 with a call center in Austin, Texas, and added one in Limerick, Ireland, in the past year. It has since built a second call center in Austin, a 450-seat facility called “Austin North” to deal with the company’s rapid customer growth and for redundancy. Trusource plans further expansion with another call center set to open in Alpine, Texas, in 2018.

In four years, Trusource has grown to 600 employees and around $30 million in annual revenue.

“We were in hyper-growth mode from when we started until I got here,” said Chapman, who joined Trusource in mid-2016. He said when he arrived at Trusource the network consisted of “one big HP server with 40 VMs and 40 cores. Obviously, that’s a single point of failure; there was no shared storage and no additional servers.”

Chapman considered building his IT infrastructure out the traditional way, adding a dedicated storage array and more servers. But that would require adding at least one engineer to his small IT staff.

Forty minutes, start to finish, and boom, I was running hyper-converged infrastructure.
Larry Chapman, IT manager, Trusource Labs

“I wasn’t sure I wanted to hire a storage engineer to calculate LUNs and do all the storage stuff,” he said. “Over the course of years, there’s a lot of salary involved there. So I started looking at new next-generation things.”

That led him to hyper-converged infrastructure, which requires no storage specialists. He checked out HCI players Nutanix, SimpliVity and Maxta’s MxSP.

Chapman ruled out SimpliVity after Hewlett Packard Enterprise bought the startup in January 2017. He worried SimpliVity OmniStack software would no longer be hardware-agnostic after the deal closed.

“I like the option to be hardware-agnostic,” he said. “I will buy my server from whoever can give me the best deal at the time. At the time I looked at SimpliVity; it was hardware agnostic, but I didn’t think it would be in the future.”

He liked Nutanix’s appliance, but its initial cost scared him off. The price seemed especially steep compared to Maxta. Chapman chose Maxta’s freemium license option, which provides software for free and charges for maintenance. He said the Maxta hyper-converged MxSP price tag came to $54,000 for his three-node cluster over the first three years. After that, he will pay $3,000 per server for support.

“I had to look at the quote a couple of times. I thought they left something off,” he said. He said a comparable setup with Nutanix would have cost around $150,000 just for the HCI appliances.

After selecting the Maxta hyper-converged software, Chapman priced servers, picking three Dell PowerEdge R530 models with 24-core processors, 120 GB of RAM and four 10 GigE interfaces for a total of $24,000. Each server has 800 GB solid-state drives for cache and six 1 TB hard disk drives in a hybrid setup.

Throw in switching and cabling, and Chapman said he ended up with his entire infrastructure for a 450-seat call center based on the Maxta hyper-converged MxSP software for $125,000.

Chapman said he was a bit leery of installing do-it-yourself software, but he followed Maxta’s checklist and did it himself anyway. Installation went smoothly.

“Forty minutes, start to finish, and boom, I was running hyper-converged infrastructure,” he said.

As part of the setup process, Maxta hyper-converged MxSP asks how many copies of data to keep on the virtual machines. Chapman said he selected three copies across his three nodes, “so no matter what combination of things I lose, as long as I have two of the servers up, the VMs will run like nothing happened.”

That bailed him out when a parity error brought down a server, but no one even noticed until an alert went out. “Everything was still chugging along,” Chapman said. A firmware upgrade fixed the problem.

Trusource now runs its production workload on the Maxta MxSP hyper-converged cluster.

He said Trusource does not use dedicated backup software for the Maxta hyper-converged cluster but replicates between data centers.

Chapman said his setup allows easy upgrades at no additional software cost because of the Maxta perpetual license.

“I will just take the server off line, shut it down, put a new server in, turn it on and repeat the process for each new server,” he said. “If I need bigger drives, I can just swap out drives while the system’s running. If I need more processing power, I just add another node in the cluster, another $8,000 server and I’m done.”

Hyper-V PowerShell commands for every occasion

You can certainly manage Hyper-V hosts and VMs with Hyper-V Manager or System Center Virtual Machine Manager, but in some cases, it’s easier to use PowerShell. With this scripting language and interactive command line, you can perform a number of actions, from simply importing a VM and performing a health check to more complex tasks, like enabling replication and creating checkpoints. Follow these five expert tips, and you’ll be well on your way to becoming a Hyper-V PowerShell pro.

Import and export Hyper-V VMs

If you need to import and export VMs and you don’t have the Hyper-V role installed, you can install Hyper-V PowerShell modules on a management server. To export a single VM, use the Export-VM command. This command creates a folder on the path specified with three subfolders — snapshots, virtual hard disks and VMs — which contain all of your VM files. You also have the option to export all of the VMs running on a Hyper-V host or to specify a handful of VMs to export by creating a text file with the VM names and executing a short script using that file. To import a single VM, use the Import-VM command. The import process will register the VM with Hyper-V and check for compatibility with the target host. If the VM is already registered, the existing VM with the same globally unique identifier will be deleted, and the VM will be registered again.
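As a sketch, the export and import flow described above might look like the following; the VM name, file paths, and list file are all hypothetical:

```powershell
# Export a single VM. Export-VM creates the Snapshots, Virtual Hard Disks
# and Virtual Machines subfolders under the target path.
Export-VM -Name "SQLVM" -Path "D:\Exports"

# Export a handful of VMs named one per line in a text file (hypothetical path).
Get-Content "C:\Scripts\vm-list.txt" |
    ForEach-Object { Export-VM -Name $_ -Path "D:\Exports" }

# Import a VM by pointing at its exported configuration file; replace <GUID>
# with the actual .vmcx file name. Add -Copy -GenerateNewId to register it
# as a new VM instead of reusing the existing identifier.
Import-VM -Path "D:\Exports\SQLVM\Virtual Machines\<GUID>.vmcx"
```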

Check Hyper-V host and VM health

You can perform a complete health check for Hyper-V hosts and VMs by using PowerShell commands. When it comes to checking the health of your Hyper-V hosts, there are a lot of elements to consider, including the Hyper-V OS and its service pack, memory and CPU usages, Hyper-V uptime and total, used and available memory. If you want to perform a health check for a standalone host, you can use individual Hyper-V PowerShell commands. To perform a health check for a cluster, use Get-ClusterNode to generate a report. When performing a VM health check, consider the following factors: VM state, integration services version, uptime, whether the VM is clustered, virtual processors, memory configuration and dynamic memory status. You can use Get-VM to obtain this information and a script using the command to check the health status of VMs in a cluster.
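A minimal health-check sketch using the cmdlets mentioned above; run it on the host, and note that Get-ClusterNode requires the FailoverClusters module on a clustered deployment:

```powershell
# Host side: OS version, memory and last boot time.
Get-CimInstance Win32_OperatingSystem |
    Select-Object Caption, TotalVisibleMemorySize, FreePhysicalMemory, LastBootUpTime

# Cluster side: the state of every cluster node.
Get-ClusterNode | Select-Object Name, State

# VM side: the health-related properties discussed above.
Get-VM | Select-Object Name, State, Status, Uptime, IntegrationServicesVersion,
    ProcessorCount, MemoryAssigned, DynamicMemoryEnabled
```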

Enable Hyper-V replication

Hyper-V replication helps keep VM workloads running in the event of an issue at the production site by replicating those workloads to the disaster recovery site and bringing them online there when need be. To configure Hyper-V replication, you need at least two Hyper-V hosts running Windows Server 2012 or later. There are a few steps involved, but it’s a pretty straightforward process. First, you need to run a script on the replica server to configure the Hyper-V replica and enable required firewall rules. Then, execute a script on the primary server to enable replication for a specific VM — we’ll name it SQLVM, in this case. Finally, initiate the replication with Start-VMInitialReplication –VMName SQLVM. After you’ve completed this process, the VM on the replica server will be turned off, while the one on the primary server will continue to provide services.
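The three steps above might be sketched as follows, assuming Kerberos authentication over HTTP; the replica server name and storage path are hypothetical:

```powershell
# Step 1, on the replica server: enable the replica role and open the
# matching firewall rule.
Set-VMReplicationServer -ReplicationEnabled $true `
    -AllowedAuthenticationType Kerberos `
    -ReplicationAllowedFromAnyServer $true `
    -DefaultStorageLocation "D:\Replicas"
Enable-NetFirewallRule -DisplayName "Hyper-V Replica HTTP Listener (TCP-In)"

# Step 2, on the primary server: enable replication for SQLVM.
Enable-VMReplication -VMName SQLVM -ReplicaServerName "HV-REPLICA01" `
    -ReplicaServerPort 80 -AuthenticationType Kerberos

# Step 3: kick off the initial copy.
Start-VMInitialReplication -VMName SQLVM
```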

Create Hyper-V checkpoints

If you’d like to test applications or just play it safe in case a problem arises, enable Hyper-V checkpoints on your VMs so you can roll back changes to a specific point in time.

The option to take point-in-time images is disabled by default, but you can enable it for a single VM with the Set-VM cmdlet. To use production checkpoints, you’ll also have to configure the VM to do so. Once you enable and configure checkpoints, you can use Checkpoint-VM to create a checkpoint, and the entry will include the date and time it was taken. The command won’t work on its own to create checkpoints for VMs on remote Hyper-V hosts, but you can use a short script to create a checkpoint in that instance. To restore a checkpoint, simply stop the VM, and then use the Restore-VMSnapshot command.
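A sketch of that workflow, with hypothetical VM, host, and checkpoint names:

```powershell
# Enable production checkpoints on one VM. With -CheckpointType Production,
# Hyper-V falls back to a standard checkpoint if a production one fails.
Set-VM -Name "SQLVM" -CheckpointType Production

# Take a named checkpoint; Hyper-V records the date and time with the entry.
Checkpoint-VM -Name "SQLVM" -SnapshotName "Before-app-update"

# The same against a remote host via -ComputerName.
Checkpoint-VM -Name "SQLVM" -SnapshotName "Before-app-update" -ComputerName "HV-HOST02"

# Roll back: stop the VM, then restore the checkpoint.
Stop-VM -Name "SQLVM"
Restore-VMSnapshot -VMName "SQLVM" -Name "Before-app-update" -Confirm:$false
```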

Use Port ACL rules in Hyper-V

Port Access Control Lists (ACLs) are an easy way to isolate VM traffic from other VMs. To use this feature, you’ll need Windows Server 2012 or later, and your VMs must be connected to a Hyper-V switch. You can create and manage Port ACL rules using just a few Hyper-V PowerShell commands, but you need to gather some information first. Figure out the source of the traffic, the direction of the traffic — inbound, outbound or both — and whether you want to block or allow traffic. Then, you can execute the Add-VMNetworkAdapterACL command with those specific parameters. You can also list all of the Port ACL rules for a VM with the Get-VMNetworkAdapterACL command. To remove a Port ACL rule associated with a VM, use Remove-VMNetworkAdapterACL. As a time-saver, combine the two previous PowerShell cmdlets to remove all of the VM’s Port ACL rules.
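The Port ACL workflow above can be sketched as follows; the VM name and subnet are hypothetical:

```powershell
# Block inbound traffic to the VM from one subnet.
Add-VMNetworkAdapterAcl -VMName "SQLVM" -RemoteIPAddress 192.168.10.0/24 `
    -Direction Inbound -Action Deny

# List every Port ACL rule on the VM's network adapters.
Get-VMNetworkAdapterAcl -VMName "SQLVM"

# Remove one rule by repeating its parameters, or pipe Get into Remove
# to clear all of the VM's rules at once.
Remove-VMNetworkAdapterAcl -VMName "SQLVM" -RemoteIPAddress 192.168.10.0/24 `
    -Direction Inbound -Action Deny
Get-VMNetworkAdapterAcl -VMName "SQLVM" | Remove-VMNetworkAdapterAcl
```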

Next Steps

Deep dive into Windows PowerShell

Manage cached credentials with PowerShell

Use PowerShell to enable automated software deployment

Use a performance counter to detect Hyper-V bottlenecks

Although Hyper-V Manager doesn’t provide any options for detecting bottlenecks on Hyper-V hosts and VMs, you can use third-party tools to detect issues related to network, storage, CPU and memory. Another option is to use the performance counters that the Windows OS ships with to detect bottlenecks on Hyper-V hosts.

There are various performance counters available to find issues on Hyper-V hosts and VMs, depending on the issue you’re facing. For example, if a Hyper-V host isn’t operating normally or takes too long to respond to Hyper-V calls from VMs and remote machines, you might want to use the Hyper-V Hypervisor Logical Processor(_Total)\% Total Run Time performance counter to ensure the Hyper-V host has enough processing power available to process requests quickly. If the logical processor run-time value stays above 85%, the Hyper-V host is overloaded and requires immediate attention.
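You can sample this counter from PowerShell with Get-Counter; a sketch, to be run on the Hyper-V host:

```powershell
# Sample total logical-processor load five times at 2-second intervals.
# Sustained values above 85% indicate an overloaded host.
Get-Counter -Counter '\Hyper-V Hypervisor Logical Processor(_Total)\% Total Run Time' `
    -SampleInterval 2 -MaxSamples 5
```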

Similarly, if you need to check whether the memory assigned to VMs is sufficient, you can use the Memory\Available MBytes performance counter. If the available memory value is consistently low, you might want to assign more memory to VMs or increase the maximum memory setting if you’re using dynamic memory.
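A quick way to watch that counter over a short window, as a sketch:

```powershell
# Average and minimum free memory over 10 samples taken 2 seconds apart;
# a consistently low value suggests the VMs or host need more memory.
(Get-Counter '\Memory\Available MBytes' -SampleInterval 2 -MaxSamples 10).CounterSamples.CookedValue |
    Measure-Object -Average -Minimum
```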

To detect storage latencies or troubleshoot storage-related issues in Hyper-V, use physical disk performance counters, such as PhysicalDisk\Avg. Disk sec/Read, PhysicalDisk\Avg. Disk sec/Write, PhysicalDisk\Avg. Disk Read Queue Length and PhysicalDisk\Avg. Disk Write Queue Length. If you find high storage latencies, you can buy additional or faster storage or move VMs to storage with spare capacity. You can also enable Storage Quality of Service if the Hyper-V host is running Windows Server 2012 or a later OS, which allows you to fine-tune storage policies for VMs.
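All four disk counters can be pulled in one call; a sketch, aggregated across all physical disks via the _Total instance:

```powershell
# Read/write latency and queue depth across all physical disks.
Get-Counter -Counter @(
    '\PhysicalDisk(_Total)\Avg. Disk sec/Read',
    '\PhysicalDisk(_Total)\Avg. Disk sec/Write',
    '\PhysicalDisk(_Total)\Avg. Disk Read Queue Length',
    '\PhysicalDisk(_Total)\Avg. Disk Write Queue Length'
) -SampleInterval 2 -MaxSamples 5
```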


To detect network bottlenecks, there are two performance counters available: the Network Interface\Bytes Total/sec counter on the physical NICs, used to gauge network performance for the Hyper-V host, and the Hyper-V Virtual Network Adapter\Bytes/sec performance counter, which shows how a VM’s network is performing.
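Both can be sampled with Get-Counter; a sketch, where the wildcard instance matches every adapter:

```powershell
# Throughput on the host's physical NICs.
Get-Counter '\Network Interface(*)\Bytes Total/sec'

# Per-VM throughput on the virtual network adapters.
Get-Counter '\Hyper-V Virtual Network Adapter(*)\Bytes/sec'
```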

You can use the above performance counters to detect bottlenecks in various Hyper-V components, which ultimately helps you get to the root cause of the problem.

Next Steps

Use these Hyper-V performance-tuning tips

Improve VM networking performance

Develop a VM load-balancing strategy to avoid mistakes
