Tag Archives: 2019

AppExchange, acquisitions key to the future of Salesforce

If numbers such as $13.28 billion fiscal 2019 revenue or 171,000 Dreamforce attendees last month are any indication, Salesforce nailed the tech side of building a wildly loyal customer following for its sales, marketing, customer service and e-commerce clouds during its first 20 years.

For the next two decades, it will take continuous technology innovation, especially in cloud integration, AI and voice, to keep those customers from defecting to Adobe, Microsoft, SAP and Oracle platforms. Far more important to the future of Salesforce, employees, customers and analysts said, is growing a Salesforce talent pool beyond the company's control: partners, developers, admins and consultants.

To woo partners, Salesforce opened its platform. It hosts the AppExchange, a third-party marketplace similar to the Apple App Store or Google Play. Lightning Platform, a low-code application development environment launched as Force.com, enables individual users to create apps and integrations themselves. Finally, Trailhead, a free, self-paced Salesforce training site, debuted in 2014; it has attracted 1.7 million people to learn developer, admin and consultant skills.

Yet it's not enough. Salesforce developer and admin talent is in short supply, and the shortage will only deepen if the company realizes CEO and founder Marc Benioff's oft-stated revenue targets of $20 billion by 2022 and $60 billion by 2034, as more customers come to Salesforce.

“Salesforce’s biggest innovation is building this open community, whether it’s admins and recognizing how crucial they are, or creating Force.com and encouraging other developers to come in and develop on their platform,” said Nicole France, an analyst at Constellation Research. “Going forward, the challenge will be keeping up with the pace of innovation — it’s a lot harder when you’re a behemoth company.”

Salesforce's 20-year boom

AppExchange, Dreamforce built over many years

When Salesforce first started, what we call cloud companies today were referred to as application service providers. Salesforce’s big innovation was building an entire platform in the cloud instead of just one app, said Michael Fauscette, an analyst at G2.


"Salesforce, and NetSuite, too, really had this idea of scaling infrastructure up and down really quickly with multi-tenancy, according to need," Fauscette said. That approach found a different buying audience. "When Salesforce first got into the enterprise, they didn't go in the traditional way. IT bought tech — except for Salesforce automation. It came in through the sales guy. They could just start using Salesforce immediately."

Quickly, though, Salesforce knew it couldn’t keep up with every individual customer’s tech needs, especially integrations with outside business applications. So, in 2006, it threw open its platform to third-party developers by introducing the AppExchange, which provided sales teams with tools to integrate Salesforce with applications such as calendars, email, accounting, HR and ERP. Today, AppExchange hosts 3,400 apps.

Force.com, now called Lightning Platform, came along two years later and enabled individual developers, or even nondevelopers, to build their own apps and connectors between Salesforce and other applications.

The AppExchange evolved into a Salesforce revenue generator in several ways, said Woodson Martin, executive vice president and general manager of Salesforce AppExchange. First, Salesforce earns revenue when an app is sold. Second, AppExchange enables customers to use Salesforce to grow their companies and, in turn, increase their Salesforce subscription. Third, it generates new leads for Salesforce when a developer creates a connector to a vertical-specific app.

“We think of AppExchange as the hub of the Salesforce ecosystem,” Martin said. “In some cases, apps are the tip of the spear for new industry verticals.”

G2’s Fauscette said that shuttling data between clouds, and between clouds and on-premises systems, will require more and more integrations between Salesforce and outside applications for at least the next decade. That makes AppExchange a crucial part of the future of Salesforce.

Acquisitions give partners new opportunities

Moving forward, AppExchange will expand into new domains, Martin said, as Salesforce integrates features and capabilities from companies it acquired, including Tableau and MuleSoft, into its platform. That will create opportunities for developers to create new customizations for data visualizations and data integrations.

Martin also said that Salesforce closely watches technology trends in the consumer retailing and e-commerce space — personalization and AI are two recent examples — to bring to its B2B platform. That’s what customers want, he said: a B2B buying experience that works as well as Amazon does at home.

But the AppExchange concept depends on outside developers buying in, and so far, their outlook on the future of Salesforce seems rosy. AppExchange partners such as configure-price-quote (CPQ) provider Apttus generally believe there's room for developers of all stripes to grow their own franchises, even when Salesforce adds overlapping native features that compete directly.

That happened when Salesforce acquired Apttus competitor SteelBrick and added Salesforce-native CPQ three years ago, said Eric Carrasquilla, senior vice president of product at Apttus. That’s because Salesforce has hundreds of thousands of CRM customers now — and the number keeps increasing.

“Salesforce is a force of nature,” Carrasquilla said, adding that Apttus and Salesforce CPQ have roughly 3,500 customers combined. “That’s still a fraction of a fraction of a fraction of the opportunity within the CRM market. It’s a very deep pool, businesswise, and there’s more than enough for everyone in the ecosystem.”

Read how Trailblazers also figure heavily into the future of Salesforce in the second part of this story.


Windows 10 issues top list of most read stories for IT pros

Windows 10 — and the challenges posed to IT professionals by its updates — dominated the enterprise desktop discussion in 2019. Stories about troubleshooting and understanding the eccentricities of this year's Windows 10 issues made up many of our top 10 most popular stories.

With the sunset of Windows 7 scheduled for the first month of 2020, interest in other Microsoft OSes, including Windows 10, may intensify in the coming year.

Below is a countdown of the top ten most-read SearchEnterpriseDesktop stories, based on page views.

  10. Micro apps, AI to power new version of Citrix Workspace

Citrix announced a new version of Citrix Workspace, which enables IT admins to provide employees with virtual access to an organization's desktop and applications, at May's Citrix Synergy event in Atlanta. The company cited micro apps, or small, task-based applications, as a key feature, saying they would handle complicated tasks more efficiently by bringing them into a unified work feed. The addition of micro apps was made possible through a $200 million acquisition of Sapho in 2018.

  9. Lenovo to launch ThinkBook brand, next-gen ThinkPad X1

Lenovo started a new subbrand — called ThinkBook — this past spring, with two laptops aimed at younger employees in the workforce. The 13- and 14-inch laptops were intended to incorporate a sleek design with robust security, reliability and support services. The company also launched a laptop for advanced business users, ThinkPad X1 Extreme Gen 2, and the ultrasmall desktop ThinkCentre M90n-1 Nano in the same time frame.

  8. Learn about the device-as-a-service model and its use cases

The device-as-a-service model, in which a vendor leases devices to a business, may help IT admins fulfill their responsibility to support, maintain and repair equipment. The model has its pros and cons. It can provide a single point of contact for troubleshooting and enable more frequent hardware refreshes, but it can also limit an organization’s device choices and pose complications for a company’s BYOD plan.

  7. Lenovo powers new ThinkPad series with AMD Ryzen Pro processors

Lenovo released three Windows 10 laptops with AMD processors this past spring, the first time it has used non-Intel chips in its higher-end ThinkPad T and X series devices. The company hoped its T495, T495s and X395 computers would provide better performance and security at a lower cost; the company said the AMD-powered T and X series laptops saw an 18% increase over the previous generation.

  6. Windows 10 security breach highlights third-party vulnerabilities

Microsoft detected a security vulnerability in Windows 10, introduced through Huawei PCManager driver software. Microsoft Defender Advanced Threat Protection, a feature that finds and blocks potential compromises, found the problem before the vulnerability could cause serious damage, but industry professionals said the incident highlighted the risks posed by third-party kernel-mode software such as device drivers and the importance of working with trusted companies.

  5. Samsung Notebook 9 Pro 2-in-1 impresses with specs and looks

Samsung released a redesign of its flagship Windows 10 laptop this year, opting for an aluminum chassis in place of the plastic from previous iterations. The device offered comparable specs to other high-end laptop offerings, with a slate of features including a backlit keyboard, a variety of inputs and the Samsung Active Pen.

  4. With the new Windows 10 OS update, trust but verify

Dave Sobel, a senior director at SolarWinds in Austin, Texas, expounded on the then-forthcoming May 2019 Windows 10 update a month before its scheduled release. Sobel acknowledged the security importance of patching systems but stressed that IT professionals should remain vigilant for complications, a notable caution given that the update came in the wake of an October 2018 patch that deleted files for users of Known Folder Redirection.

  3. Citrix CEO David Henshall addresses Citrix news, sale rumors

In a Q&A, Citrix CEO David Henshall talked about the future of the 30-year-old company, downplaying rumors that it would be sold. Henshall spoke of the venerable firm’s history of connecting people and information on demand and saw the coming years as a time when Citrix would continue to simplify and ease that connection to encourage productivity.

  2. Latest Windows 10 update issues cause more freezing problems

The April 9 Windows 10 update caused device freezing upon launch. Those in IT had already noted freezing in devices using Sophos Endpoint Protection; after a few days, they learned that the patch was clashing with antivirus software, causing freezing both during startup and during regular operation of the computer. Microsoft updated its support page to acknowledge the issue and provided workarounds shortly thereafter.

  1. IT takes the good with the bad in Windows 10 1903 update

After problems with previous Windows 10 updates, IT pros gave June's 1903 version an initially positive, but wary, reception. Microsoft's Windows-as-a-service model drew complaints for the way it implemented updates. Among its changes, 1903 enabled IT professionals to pause feature and monthly updates for up to 35 days. Also new was Windows Sandbox, which gave IT the ability to test application installations without compromising a machine. The new version of Windows 10 did not launch bug-free, however; users reported issues with Wi-Fi connectivity, Bluetooth device connections and USB devices causing drive letters to be rearranged.


HCI storage adoption rises as array sales slip

The value and volume of data keep growing, yet in 2019 most primary storage vendors reported a drop in sales.

Part of that has to do with companies moving data to the cloud. Data is also being redistributed on premises, moving from traditional storage arrays to hyper-converged infrastructure (HCI) and to data protection products that have expanded into data management.

That helps explain why Dell Technologies bucked the trend of storage revenue declines last quarter. A close look at Dell's results shows its gains came from areas outside of traditional primary storage arrays, a segment that has been flat or down for its rivals.

Dell’s storage revenue of $4.15 billion for the quarter grew 7% over last year, but much of Dell’s storage growth came from HCI and data protection. According to Dell COO Jeff Clarke, orders of VxRail HCI storage appliances increased 82% over the same quarter in 2018. Clarke said new Data Domain products also grew significantly, although Dell provided no revenue figures for backup.

Hyper-converged products combine storage, servers and virtualization in one box. VxRail, which relies on vSAN software from Dell-owned VMware running on Dell PowerEdge servers, appears to be cutting into sales of both stand-alone servers and storage. Dell server revenue declined around 10% year-over-year, about the same as rival Hewlett Packard Enterprise's (HPE) server decline.

“We’re in this data era,” Clarke said on Dell’s earnings call last week. “The amount of data created is not slowing. It’s got to be stored, which is probably why we are seeing a slightly different trend from the compute side to the storage side. But I would point to VxRail hyper-convergence, where we’ll bring computing and storage together, helping customers build on-prem private clouds.”


Dell is counting on a new midrange storage array platform to push storage revenue in 2020. Clarke said he expected those systems to start shipping by the end of January.

Dell's largest storage rivals have reported a pause in spending, partially because of global conditions such as trade wars and tariffs. NetApp revenues have fallen year-over-year in each of the last three quarters, including a 9.6% dip to $1.38 billion last quarter. HPE said its storage revenue of $848 million dropped 12% from last year. HPE's Nimble Storage midrange array platform grew 2% and SimpliVity HCI increased 14% year-over-year, a sign that 3PAR enterprise arrays fell and the vendor's new Primera flagship arrays have not yet generated meaningful sales.


IBM storage has also declined throughout the year, dropping 4% year-over-year to $434 million last quarter. Pure Storage’s revenue of $428 million last quarter increased 16% from last year, but Pure had consistently grown revenue at significantly higher rates throughout its history.

Meanwhile, HCI storage revenue is picking up. Nutanix last week reported a leveling of revenue following a rocky start to 2019. Related to VxRail’s increase, VMware said its vSAN license bookings had increased 35%. HPE’s HCI sales grew, while overall storage dropped. Cisco did not disclose revenue for its HyperFlex HCI platform, but CEO Chuck Robbins called it out for significant growth last quarter.

Dell/VMware and Nutanix still combine for most of the HCI storage market. Nutanix’s revenue ($314.8 million) and subscription ($380.0 million) results were better than expected last quarter, although both numbers were around the same as a year ago. It’s hard to accurately measure Nutanix’s growth from 2018 because the vendor switched to subscription billing. But Nutanix added 780 customers and its 66 deals of more than $1 million were its most ever. And the total value of its customer contracts came to $305 million, up 9% from a year ago.

Nutanix’s revenue shift came after the company switched to a software-centric model. It no longer records revenue from the servers it ships its software on. Nutanix and VMware are the dominant HCI software vendors.

“It’s just the two of us, us and VMware,” Nutanix CEO Dheeraj Pandey said in an interview after his company’s earnings call. “Hyper-convergence now is really driven by software as opposed to hardware. I think it was a battle that we had to win over the last three or four years, and the dust has finally settled and people see it’s really an operating system play. We’re making it all darn simple to operate.”


AT&T integrating 5G with Microsoft cloud to enable next-generation solutions on the edge

DALLAS and REDMOND, Wash. — Nov. 26, 2019 — Microsoft and AT&T are ramping up innovation in the early days of their strategic alliance announced in July. One area of focus is aimed at enabling new 5G, cloud and edge computing solutions to drive enterprise capabilities for companies around the world.

The companies are opening select preview availability for Network Edge Compute (NEC) technology, which weaves Microsoft Azure cloud services into AT&T network edge locations closer to customers. This means AT&T’s software-defined and virtualized 5G core – what the company calls the Network Cloud – is now capable of delivering Azure services. NEC will initially be available for a limited set of select customers in Dallas. Next year, Los Angeles and Atlanta are targeted for select customer availability.

From making the world’s first 5G millimeter wave browsing session on a commercial 5G device to groundbreaking commercial installations in healthcare, manufacturing and entertainment, AT&T has proved itself to be a leader in 5G. The company recently activated an industry-first 400-gigabit connection between Dallas and Atlanta to support video, gaming and other 5G needs. AT&T serves parts of 21 cities with its 5G network using millimeter wave spectrum (5G+) and plans to offer nationwide 5G in the first half of 2020.

“The first smartphones on 3G networks introduced the idea of mobile apps over a decade ago. A few years later, 4G LTE made it feasible to connect those devices faster to cloud applications to stream videos, hail rides, and broadcast content to the world,” said Mo Katibeh, EVP and chief marketing officer, AT&T Business. “With our 5G and edge computing, AT&T is collaborating uniquely with Microsoft to marry their cloud capabilities with our network to create lower latency between the device and the cloud that will unlock new, future scenarios for consumers and businesses. We’ve said all year developers and businesses will be the early 5G adopters, and this puts both at the forefront of this revolution.”

This innovation points to a future where high-end augmented reality glasses are as thin and stylish as a standard pair of eyeglasses, lightweight drones can track themselves and thousands of nearby companions in near-real time, and autonomous cars have access to nearly instant data processing capabilities without having to install a mini data center in the trunk.

“We are helping AT&T light up a wide range of unique solutions powered by Microsoft’s cloud, both for its business and our mutual customers in a secure and trusted way,” said Corey Sanders, corporate vice president, Microsoft Solutions. “The collaboration reaches across AT&T, bringing the hyperscale of Microsoft Azure together with AT&T’s network to innovate with 5G and edge computing across every industry.”

 5G and edge for gaming, drones, and more

One example of how edge computing can unlock new scenarios and experiences is in mobile gaming, where gaming company Game Cloud Network has created a unique 5G game that's hosted on the network edge with Microsoft Azure. Game Cloud Network, a pioneer in developing game-based brand engagement and a customer of AT&T, is now showcasing its new "Tap & Field" game, which utilizes Microsoft's Azure PlayFab services. In the track-and-field-style game, users race each other in near-real time, enabled by the speed of 5G-connected devices.

“5G gaming provides consumers with the best of both worlds: highly-immersive experiences on lightweight mobile devices,” said Aaron Baker, chief executive officer, Game Cloud Network. “AT&T and Microsoft are building the perfect environment for game developers to create amazing new possibilities for gamers. 5G and edge computing have the potential to radically change how we play together and launch new business opportunities for brands and game publishers.”

Through AT&T Foundry, AT&T and Microsoft are exploring proofs-of-concept including augmented and virtual reality scenarios and drones. For example, both companies continue to work with Israeli startup Vorpal, helping its VigilAir product track drones in commercial zones, airports, and other areas with near-instant positioning. The companies also recently demoed using Microsoft HoloLens to provide 3D schematic overlays for technicians making repairs to airplanes and other industrial equipment.

Progress toward a “public-cloud first company” and more

Microsoft is also helping AT&T Communications become a “public-cloud first” company by migrating most non-network workloads to the public cloud by 2024, and this migration to Azure is already underway. Another important part of AT&T’s strategy is to empower much of its workforce with Microsoft 365. This includes cloud-connected Office apps on Windows 10, and modern collaboration with Microsoft Teams, SharePoint and OneDrive. AT&T has begun rolling out these solutions to tens of thousands of employees to help drive a culture of modern work.

AT&T and Microsoft will have more to share over the coming months and years as this unique alliance continues to evolve and expand. The two companies will both create and adopt new technologies to develop tools, commercial services and consumer applications that benefit everyone.

About AT&T

AT&T Inc. (NYSE: T) is a diversified, global leader in telecommunications, media and entertainment, and technology. It executes in the market under four operating units. WarnerMedia is a leading media and entertainment company that creates and distributes premium and popular content to global audiences through its consumer brands including: HBO, Warner Bros., TNT, TBS, truTV, CNN, DC Entertainment, New Line, Cartoon Network, Adult Swim, Turner Classic Movies and others. AT&T Communications provides more than 100 million U.S. consumers with entertainment and communications experiences across TV, mobile and broadband services. Plus, it serves nearly 3 million business customers with high-speed, highly secure connectivity and smart solutions. AT&T Latin America provides pay-TV services across 11 countries and territories in Latin America and the Caribbean, and is the fastest growing wireless provider in Mexico, serving consumers and businesses. Xandr provides marketers with innovative and relevant advertising solutions for consumers around premium video content and digital advertising through its AppNexus platform.

AT&T products and services are provided or offered by subsidiaries and affiliates of AT&T Inc. under the AT&T brand and not by AT&T Inc. Additional information is available at about.att.com. © 2019 AT&T Intellectual Property. All rights reserved. AT&T, the Globe logo and other marks are trademarks and service marks of AT&T Intellectual Property and/or AT&T affiliated companies. All other marks contained herein are the property of their respective owners.

Cautionary Language Concerning Forward-Looking Statements

 Information set forth in this news release contains financial estimates and other forward-looking statements that are subject to risks and uncertainties, and actual results might differ materially. A discussion of factors that may affect future results is contained in AT&T’s filings with the Securities and Exchange Commission. AT&T disclaims any obligation to update and revise statements contained in this news release based on new information or otherwise.

This news release may contain certain non-GAAP financial measures. Reconciliations between the non-GAAP financial measures and the GAAP financial measures are available on the company’s website at https://investors.att.com.

About Microsoft

Microsoft (Nasdaq “MSFT” @microsoft) enables digital transformation for the era of an intelligent cloud and an intelligent edge. Its mission is to empower every person and every organization on the planet to achieve more.

For more information, contact:

Clay Owen
AT&T Corporate Communications
Phone: (404) 538-0124
Email: [email protected]

Microsoft Media Relations
WE Communications for Microsoft
Phone: (425) 638-7777
Email: [email protected]

Author: Microsoft News Center

For Sale – 13″ MacBook Pro with Touch Bar (2019) – 128 GB SSD, Space Grey

Hi Guys,

For sale I have a 2019 13″ Macbook Pro with the Currys 3 year warranty (which includes accidental damage etc). Specification as found here: APPLE 13″ MacBook Pro with Touch Bar (2019) – 128 GB SSD, Space Grey

Initially purchased for my partner and her job at a local school. However, out of the blue (and after 11 years), they have actually provided her with a pretty decent Windows alternative which she is happy to use.

It’s only 8/9 weeks old. Hasn’t had much use at all. Approx 8 charge cycles (forgot to check before resetting sorry). But it really hasn’t been touched a great deal at all. It’s boxed and essentially looks ‘as new’ with not a mark or blemish.

In total we paid £1450 for the package including a trade in, plus the 3 year warranty as mentioned above. A great little machine that I have thought about keeping but it would only collect dust as I personally use Windows / Android. So thought I would offer it up here first.

Any questions, please feel free to ask away. Many thanks…


Effectively implement Azure Ultra Disk Storage

In August 2019, Microsoft announced the general availability of a new Managed Disks tier: Ultra Disk Storage. The new offering represents a significant step up from the other Managed Disks tiers, offering unprecedented performance and sub-millisecond latency to support mission-critical workloads.

The Ultra Disk tier addresses organizations reluctant to move data-intensive workloads to the cloud because of throughput and latency requirements.

According to Microsoft, Azure Ultra Disk Storage makes it possible to support these workloads by delivering next-generation storage technologies geared toward performance and scalability, while providing you with the convenience of a managed cloud service.

Understanding Azure Ultra Disk

Managed Disks is an Azure feature that simplifies disk management for infrastructure-as-a-service storage. A managed disk is a virtual hard disk that works much like a physical disk, except that the storage is abstracted and virtualized. Azure stores the disks as page blobs, in the form of random I/O storage objects.

To use managed disks, you only have to provision the necessary storage resources and Azure does the rest, deploying and managing the drives.

Azure offers four Managed Disks tiers: Standard HDD, Standard SSD, Premium SSD and the new Ultra Disk Storage, which also builds on SSD technologies. Ultra Disk SSDs support enterprise-grade workloads driven by systems such as MongoDB, SQL Server, SAP HANA and high-performing, mission-critical applications. The latest storage tier comes with configurable performance attributes, making it possible to adjust IOPS and throughput to meet evolving performance requirements.

Azure Ultra Disk Storage implements a distributed block storage architecture that uses NVMe to support I/O-intensive workloads. NVMe is a host controller interface and storage protocol that accelerates data transfers between data center systems and SSDs over a computer’s high-speed PCIe bus.


Along with the new storage tier, Azure introduced the virtual disk client (VDC), a simplified client that runs on the compute host. The client has full knowledge of the virtual disk metadata mappings in the Azure Ultra Disk cluster. This knowledge enables the client to communicate directly with the storage servers, bypassing the load balancers and front-end servers often used to establish initial disk connections.

With earlier Managed Disk storage tiers, the route was much less direct. For example, Azure Premium SSD storage is dependent on the Azure Blob storage cache. As a result, the compute host runs the Azure Blob Cache Driver, rather than the VDC. The driver communicates with a storage front end, which, in turn, communicates with partition servers. The partition servers then talk to the stream servers, which connect to the storage devices.

The VDC, on the other hand, supports a more direct connection, minimizing the number of layers that read and write operations traverse, reducing latency and increasing performance.

Deploying Ultra Disk Storage

Azure Ultra Disk Storage lets you configure capacity, IOPS and throughput independently, providing the flexibility necessary to meet specific performance requirements. For capacity, you can choose a disk size ranging from 4 GiB to 64 TiB, and you can provision the disks with up to 300 IOPS per GiB, to a maximum of 160,000 IOPS per disk. For throughput, Azure supports up to 2,000 MB per second, per disk.
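For example, a 16 GiB disk can be provisioned with up to 16 x 300 = 4,800 IOPS. The sketch below, which uses the Az PowerShell module, shows roughly how such a disk might be created and attached; the resource group, disk and VM names, zone and performance figures are illustrative assumptions, and the target VM must already have ultra disk compatibility enabled and sit in the same availability zone as the disk.

# Define an ultra disk: 16 GiB, 4,800 IOPS and 200 MBps (illustrative values)
$diskConfig = New-AzDiskConfig -Location 'EastUS2' -Zone 1 -SkuName 'UltraSSD_LRS' -DiskSizeGB 16 -DiskIOPSReadWrite 4800 -DiskMBpsReadWrite 200 -CreateOption Empty
$disk = New-AzDisk -ResourceGroupName 'rg-demo' -DiskName 'ultra01' -Disk $diskConfig

# Attach the disk to the VM as a data disk and apply the change
$vm = Get-AzVM -ResourceGroupName 'rg-demo' -Name 'vm-demo'
$vm = Add-AzVMDataDisk -VM $vm -Name 'ultra01' -ManagedDiskId $disk.Id -Lun 0 -CreateOption Attach
Update-AzVM -ResourceGroupName 'rg-demo' -VM $vm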

Ultra Disk Storage makes it possible to utilize a VM’s maximum I/O limits using only a single ultra disk, without needing to stripe multiple disks. You can also configure disk IOPS or throughput without detaching the disk from the VM or restarting the VM. Azure automatically implements the new performance settings in less than an hour.
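As a rough illustration of that on-the-fly adjustment, again with the Az PowerShell module and the same assumed disk and resource group names as above:

# Raise the provisioned performance of an attached ultra disk without detaching it
$diskUpdate = New-AzDiskUpdateConfig -DiskIOPSReadWrite 8000 -DiskMBpsReadWrite 400
Update-AzDisk -ResourceGroupName 'rg-demo' -DiskName 'ultra01' -DiskUpdate $diskUpdate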

To deploy Ultra Disk Storage, you can use the Azure Resource Manager, Azure CLI or PowerShell. Ultra Disk Storage is currently available in three Azure regions: East US 2, North Europe and Southeast Asia. Microsoft plans to extend to other regions, but the company has not provided specific timelines. In addition, Ultra Disk Storage supports only the ESv3 and DSv3 Azure VMs.

Azure Ultra Disk handles data durability behind the scenes. The service is built on Azure’s locally redundant storage (LRS), which maintains three copies of the data within the same availability zone. If an application writes data to the storage service, Azure will acknowledge the operation only after the LRS system has replicated the data.

When implementing Ultra Disk Storage, you must consider the throttling limits Azure places on resources. For example, you could configure your VM with a 16-GiB ultra disk at 4,800 IOPS. However, if you’re working with a Standard_D2s_v3 VM, you won’t be able to take full advantage of the storage because the VM gets throttled to 3,200 IOPS as a result of its limitations. To realize the full benefits available to Ultra Disk Storage, you need hardware that can support its capabilities.

Where Ultra Disk fits in the Managed Disk lineup

Azure Managed Disks simplify disk management by handling deployment and management details behind the scenes. Currently, Azure provides the following four storage options for accommodating different workloads.

The Standard HDD tier is the most basic tier, providing a reliable, low-cost option that supports workloads in which IOPS, throughput and latency are not critical to application delivery. For this reason, the Standard HDD tier is well suited to backup and other non-critical workloads. The maximum disk size for this tier is 32,767 GiB, the maximum IOPS is 2,000 and the maximum throughput is 500 MiB per second.

The Standard solid-state drive tier offers a step up from the Standard HDD tier to support workloads that require better consistency, availability, reliability and latency. The Standard SSD tier is well suited to web servers and lightly used applications, as well as development and testing environments. The maximum disk size for this tier is 32,767 GiB, the maximum IOPS is 6,000 and the maximum throughput is 750 MiB per second.

Prior to the release of the Ultra Disks tier, the Premium SSD tier was the top offering in the Managed Disks stack. The Premium tier is geared toward production and performance-sensitive workloads that require greater performance than the lower tiers. This tier can benefit mission-critical applications that support I/O-intensive workloads. The maximum disk size for this tier is 32,767 GiB, the maximum IOPS is 20,000 and the maximum throughput is 900 MiB per second.

The Ultra Disks tier is the newest Managed Disks service available to customers. The new tier takes performance to the next level, delivering high IOPS and throughput, with consistently low latency. Customers can dynamically change performance settings without restarting their VMs. The Ultra Disks tier targets data-intensive applications such as SAP HANA, Oracle Database and other transaction-heavy workloads. The maximum disk size for this tier is 65,536 GiB, the maximum IOPS is 160,000 and the maximum throughput is 2,000 MiB per second.

Because Ultra Disk Storage is a new Azure service, it comes with several limitations. The service is available in only a few regions and works with only a couple types of VMs. Additionally, you cannot attach an ultra disk to a VM running in an availability set. The service also does not support snapshots, VM scale sets, Azure disk encryption, Azure Backup or Azure Site Recovery. You can’t convert an existing disk to an ultra disk, but you can migrate the data from an existing disk to an ultra disk.

Despite these limitations, Azure Ultra Disk Storage could prove to be an asset to organizations that plan to move their data-intensive applications to the cloud. No doubt Microsoft will continue to improve the service, extending its reach to other regions and addressing the lack of support for other Azure data services, but that hasn't happened yet, and some IT teams might insist that these issues be resolved before they consider migrating their workloads. In the meantime, Ultra Disk Storage promises to be a service worth watching, especially for organizations already committed to the Azure ecosystem.


Microsoft scientist accepts Hamburg Prize for Theoretical Physics for quantum contributions

This week, Dr. Matthias Troyer, a Distinguished Scientist at Microsoft, accepted the 2019 Hamburg Prize for Theoretical Physics – one of the most valuable German prizes in the field – for his groundbreaking contributions to the development of quantum Monte Carlo algorithms.

“In Professor Troyer, we are honoring a scientist whose work connects myriad areas of physics and computer science. On account of his current research in the field of quantum computing, he partners with universities and companies in the US and around the world. He has also set up an open-source platform in order to share his knowledge. By awarding the prize to Professor Troyer, we also wish to recognize this contribution to collaborative research,” explained Dr. Nina Lemmens, Member of the Executive Board of the Joachim Herz Stiftung.

Dr. Troyer works at the interface between computer science and theoretical physics and is one of just a handful of leading international researchers in this field. Monte Carlo algorithms can predict how tiny particles will interact within quantum mechanical many-body systems such as atoms and molecules, and Dr. Troyer’s work in this area is playing a key role in the research and ongoing development of quantum computers and superconducting materials.

When asked about what this honor means to him, Dr. Troyer said, "One reason I came to Microsoft and why I want to build a quantum computer is that when inventing these Monte Carlo methods, we made big breakthroughs, but we also encountered a fundamental problem of Monte Carlo simulations of quantum systems, the so-called 'sign problem.' The workaround becomes exponentially difficult; a quantum computer will help us move past these barriers."

With the recent Microsoft announcement of Azure Quantum, teams will soon be able to experiment running algorithms like Monte Carlo against both classical hardware in Azure and quantum hardware from partners, knowing these solutions will scale to future quantum systems as well.

The prize not only comes with a grant, but also entails research visits to Hamburg that will see Dr. Troyer give talks and work closely with doctoral candidates, postdocs, and other colleagues.

Dr. Troyer continued, “I’m looking forward to engaging the academic community in discussing and further advancing what we can do with quantum computing. As we think of quantum algorithms for material science, what problems can we solve now with quantum simulations? And how do we develop quantum algorithms to run once we have a fully scalable quantum computer?”

“The connection to Hamburg means that we can engage with the academic and scientific communities, and with that, I look forward to talking to the people in Hamburg – and around the world – about applying quantum systems and quantum computing to make an impact on material science problems.”

Microsoft and the Azure Quantum team congratulate Dr. Troyer on this significant recognition, and we look forward to supporting his important work in making an impact in solving some of the world’s toughest challenges with quantum computing.

Author: Microsoft News Center

For Sale – Apple MacBook Pro 13.3″ Latest Edition 2019 (512GB,i5,16GB) Silver – New

Apple MacBook Pro 13.3″ 2019 (512GB, Intel Core i5, 16GB) Silver – New. Condition is New.

Still in Cellophane! Top Specs!

Brand New from Apple, latest version!

Silver 2.4GHz quad-core 8th‑generation Intel Core i5 processor, Turbo Boost up to 4.1GHz

Retina display with True Tone
Touch Bar and Touch ID
Intel Iris Plus Graphics 655
16GB 2133MHz LPDDR3 memory
512GB SSD storage
Four Thunderbolt 3 ports
Backlit Keyboard – British

Cash on collection.

Cheaper than buying direct from Apple.


The Big Announcements from Microsoft Ignite 2019

Day 1 of the 2019 Microsoft Ignite event wrapped up yesterday. As always, there was no shortage of announcements for IT professionals of all walks of life. In this blog post we're going to talk about some of the bigger announcements that affect the infrastructure space as well as Office services. So let's dive in!

Azure Arc

There is no doubt that Azure Arc was the headline of the show for many attendees. In short, Azure Arc is designed to be a control plane for multi-cloud and multi-edge deployments, meaning single-pane-of-glass management for all computing resources regardless of where they live. Many organizations have compute resources in many locations, including on-premises, Azure and other cloud environments, and the big challenge has always been maintaining effective management of these disparate systems. Azure Arc is designed to address this issue.

With Azure Arc, your on-premises resources actually appear in the Azure portal and can be managed in much the same way as you would manage a VM running in Azure. The on-prem resources can then be integrated with Azure services such as Log Analytics and Azure Policy. Additionally, in a demo during the Azure technical keynote, an instance of the Azure SQL Database service was actually pushed down to one of the on-premises VMs, just as if it were another Azure resource!

If you’d like more details on this feature, we’ll be putting together some content in the near future. In the meantime check out the video below that’s been put out by the Azure Advocacy Team.

[embedded content]

Project Cortex

Another noteworthy announcement from the keynotes today was Project Cortex. The stated goal of Project Cortex is to take all of the data that resides in an organization’s Office 365 tenant and create usable knowledge out of it by leveraging AI. For example, you set a meeting with a co-worker. This feature would see that you’ve set a meeting with your co-worker and would do things like show you the last couple of documents you shared with that person. Or, another example would be topic centers and knowledge centers. These assets are automatically created and updated by AI and present applicable information to people in various office applications. The whole goal is to more readily present the information you need when you need it. More preliminary information on Project Cortex can be found here.

Azure Quantum

A year or two back at a previous Microsoft Ignite, Microsoft showed off some enhancements in the realm of quantum computing. If you follow the news in quantum computing, you likely know that it's a radically different computing model and will fundamentally alter the tech scene. Azure Quantum is a collection of quantum services from Microsoft and a number of other vendors that is designed to enable open access to quantum resources. Another thing that's nice to see is that this service is being built with openness in mind. Open source is a key design decision and will help benefit everyone involved in bringing quantum computing to the masses in the coming years.

If you're interested in finding out a bit more about these new features, Microsoft now has a website set up for Azure Quantum here.

Additional Options for the Intelligent Edge and Azure Stack

As you likely know from previous years, Azure Stack was designed to bring a purpose-built appliance with Azure services to your datacenter. It's seen widespread adoption since its release, especially in the enterprise and service provider space. However, Microsoft sees edge use cases (think branch offices and remote locations) as a key area where Azure Stack can be leveraged extensively. With this in mind, Microsoft has created several different variants of Azure Stack for a vast swath of compute use cases. These use cases range from branch office retail stores with a small Azure Stack Edge device in the building to search and rescue teams processing drone data with "ruggedized" Azure Stack instances in their backpacks!

We’ll be talking more about Azure Stack in the coming weeks and months, but if you’re interested in more information regarding how all these options fit together, Microsoft has updated its Azure Stack product site with many of these new offerings.

Interview with Jeffrey Snover

[embedded content]

Further Information

The announcements mentioned here are simply the tip of the iceberg. There were several other smaller announcements that came out of Microsoft today, including things like new VM sizes in Azure IaaS, a new Performance Monitor application, and GA of Windows Admin Center 1910. We’ll be bringing you updated information on these topics as the week progresses, and also be on the lookout for blogs on each of these areas on the Altaro blogs in the near future!

Stay tuned for more information!

Author: Andy Syrewicze

How to manage Server Core with PowerShell

After you first install Windows Server 2019 and reboot, you might find something unexpected: a command prompt.

While you’re sure you didn’t select the Server Core option, Microsoft now makes it the default Windows Server OS deployment for its smaller attack surface and lower system requirements. While you might remember DOS commands, those are only going to get you so far. To deploy and manage Server Core, you need to build your familiarity with PowerShell to operate this headless flavor of Windows Server.

To help you on your way, you will want to build your knowledge of PowerShell, and you might start with the PowerShell Integrated Scripting Environment (ISE). PowerShell ISE offers a wealth of features for the novice PowerShell user, from autocompletion of commands to context-colored syntax, to step you through the scripting process. The problem is PowerShell ISE requires a GUI, or the "full" Windows Server. To manage Server Core, you have the command window and PowerShell in its raw form.

Start with the PowerShell basics

To start, type in powershell to get into the environment, denoted by the PS before the C: prompt. A few basic DOS commands will work, but PowerShell is a different language. Before you can add features and roles, you need to set your IP and domain. It can be done in PowerShell, but this is laborious and requires a fair amount of typing. Instead, we can take a shortcut and use sconfig to complete the setup. After that, we can use PowerShell for additional administrative work.
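For reference, the native PowerShell route looks roughly like the sketch below; the interface alias, addresses, computer name and domain are placeholder assumptions for your own environment, which is exactly why sconfig is the faster path here.

# Assign a static IP address and a DNS server (values are examples)
New-NetIPAddress -InterfaceAlias 'Ethernet' -IPAddress 192.168.1.50 -PrefixLength 24 -DefaultGateway 192.168.1.1
Set-DnsClientServerAddress -InterfaceAlias 'Ethernet' -ServerAddresses 192.168.1.10

# Rename the server, join it to a domain and reboot in one step
Add-Computer -DomainName 'corp.example.com' -NewName 'CORE01' -Credential (Get-Credential) -Restart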

PowerShell uses a verb-noun format, called cmdlets, for its commands, such as Install-WindowsFeature or Get-Help. The verbs have predefined categories that are generally clear on their function. Some examples of PowerShell cmdlets are:

  • Install: Use this PowerShell verb to install software or some resource to a location or initialize an install process. This would typically be done to install a windows feature such as Dynamic Host Configuration Protocol (DHCP).
  • Set: This verb modifies existing settings in Windows resources, such as networking configuration. It also works to create the resource if it did not already exist.
  • Add: Use this verb to add a resource or setting to an existing feature or role. For example, this could be used to add a scope onto the newly installed DHCP service.
  • Get: This is a resource retriever for data or contents of a resource. You could use Get to present the resolution of the display and then use Set to change it.

To install DHCP to a Server Core deployment with PowerShell, use the following commands.

Install the service:

Install-WindowsFeature -Name 'dhcp'

Add a scope for DHCP:

Add-DhcpServerV4Scope -Name "Office" -StartingRange 192.168.1.100 -EndRange 192.168.1.200 -SubnetMask 255.255.255.0

Set the lease time:

Set-DhcpServerv4Scope -ScopeId 192.168.1.0 -LeaseDuration 1.00:00:00

Check the DHCP IPv4 scope:

Get-DhcpServerv4Scope
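If the server is joined to an Active Directory domain, the new DHCP server also needs to be authorized in the directory before it will begin handing out leases. The command below shows the general form; the DNS name and IP address are placeholders for your own server:

Add-DhcpServerInDC -DnsName core01.corp.example.com -IPAddress 192.168.1.50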

Additional pointers for PowerShell newcomers

Each command has a purpose, which means you have to know the syntax, and that is the hardest part of learning PowerShell. Not knowing what you're looking for can be very frustrating, but there is help. The Get-Help cmdlet displays the related commands for use with that function or role.

Part of the trouble for new PowerShell users is that memorizing all the commands can still be overwhelming, but there is a shortcut. As you start to type a command, the Tab key auto-completes PowerShell commands. For example, if you type Get-Help R and press the Tab key, PowerShell will cycle through the matching commands, such as Remove-DhcpServerInDC (see Figure 1). When you find the command you want and hit Enter, PowerShell presents additional information for using that command. Get-Help even supports wildcards, so you could type Get-Help *dhcp* to get results for commands that contain that phrase.

Figure 1. Use the Get-Help command to see the syntax used with a particular PowerShell cmdlet.

The tab function in PowerShell is a savior. While this approach is a little clumsy, it is a valuable asset in a pinch due to the sheer number of commands to remember. For example, a base install of Windows 10 includes Windows PowerShell 5.1, which features more than 1,500 cmdlets. As you install additional PowerShell modules, you make more cmdlets available.
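You can ask PowerShell itself how large that command surface is; the counts below will vary by system and by which modules are installed.

# Count the cmdlets and functions available in the current session, then list installed modules
(Get-Command -CommandType Cmdlet, Function).Count
Get-Module -ListAvailable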

There are many PowerShell books, but do you really need them? There are extensive libraries of PowerShell code that are free to manipulate and use. Even walking through a Microsoft wizard gives the option to create the PowerShell code for the wizard you just ran. As you learn where to find PowerShell code, writing a script becomes less a matter of starting from scratch and more one of modifying existing code. You don't have to be an expert; you just need to know how to manipulate the proper fields and areas.

Outside of typos, the biggest stumbling block for most beginners is not reading the screen. PowerShell does a mixed job with its error messages. The type is red when something doesn’t work, and PowerShell will give the line and character where the error occurred.

In the example in Figure 2, PowerShell threw an error due to the extra letter s at the end of the command Get-WindowsFeature. The system didn’t recognize the command, so it tagged the entire command rather than the individual letter, which can be frustrating for beginners.

Figure 2. When working with PowerShell on the command line, you don't get precise locations of where an error occurred if you have a typo in a cmdlet name.

The key is to review your code closely, then review it again. If the command doesn’t work, you have to fix it to move forward. It helps to stop and take a deep breath, then slowly reread the code. Copying and pasting a script from the web isn’t foolproof and can introduce an error. With some time and patience, and some fundamental PowerShell knowledge of the commands, you can get moving with it a lot quicker than you might have thought.
