Tag Archives: Cloud

Microsoft to deliver intelligent cloud from Norway datacenters | Stories

Microsoft Cloud to accelerate digital transformation and innovation through a strategic partnership with Equinor and to the benefit of organizations across Norway

REDMOND, Wash., and OSLO, Norway — June 20, 2018 — Microsoft Corp. on Wednesday announced plans to further expand its significant and growing investment in cloud computing in Europe by delivering the intelligent Microsoft Cloud from two new datacenter regions in Norway: one in the greater Stavanger region and the other in Oslo.

The Microsoft Cloud, comprising Microsoft Azure, Office 365 and Dynamics 365, will offer enterprise-grade reliability and performance with data residency from new datacenter locations. Initial availability of Azure is planned for late 2019 with Office 365 and Dynamics 365 to follow. Microsoft has deep expertise protecting data, championing privacy, and empowering customers around the globe to meet extensive security and privacy requirements with Microsoft’s Trusted Cloud principles and the broadest set of compliance certifications and attestations in the industry.

“Over a billion customers around the world trust the intelligent Microsoft Cloud to provide a platform to help transform their businesses,” said Jason Zander, executive vice president, Microsoft Azure, Microsoft. “By delivering the Microsoft Cloud from new datacenter regions in Norway, organizations will be empowered through cloud-scale innovation while meeting their data residency, security and compliance needs.”

Equinor, an international energy company, has chosen the Microsoft Cloud in Norway to enable its digital transformation and drive cloud-enabled innovation. The strategic partnership is supporting Equinor's digital journey through a seven-year consumption and development agreement valued in the hundreds of millions of dollars (USD). Leveraging the cloud is a prerequisite for the energy industry's transformation toward a digital future, and secure, reliable and cost-efficient operations are a requirement for Equinor's adoption of the cloud.

“Equinor plays a central role in stimulating innovation and advancement of the Norwegian economy, and we are deeply honored to be partnering with them to help take their business into its next stage of growth through the intelligent Microsoft Cloud,” said Kimberly Lein-Mathisen, general manager, Microsoft Norway. “By bringing these new datacenters online in Norway, we are also very pleased to be able to pave the way for growth and transformation of many other businesses and organizations in Norway, whether they be large enterprises, government bodies, or any of the 200,000 small and medium-size businesses that create Norway’s thriving economy.”

Torbjørn Røe Isaksen, Norwegian minister of Trade and Industry said, “The Norwegian government is deeply committed to helping Norway thrive as a hub for digital innovation. Norway needs new industries that create jobs and boost economic growth. In February 2018 the Norwegian government released its datacenter strategy ‘Powered by Nature,’ establishing that attracting datacenters and international investments is an important part of our industrial policy. Therefore, we are very pleased to see Microsoft’s commitment to our country with this new datacenter. We believe that datacenters and cloud services will help ensure the competitiveness and productivity of Norwegian businesses and government institutions, and have a positive impact on our responsibility to our citizens to create an inclusive working life, to the environment, and to our economic development and job growth.”

The delivery of cloud services from Norway expands on Microsoft's existing investments in the country, where it has operated since 1990 and now employs nearly 600 people across sales, marketing and development in offices in Lysaker, Oslo, Trondheim and Tromsø, supported by a network of more than 1,700 partners. This new investment marks the first time Microsoft will deliver the intelligent Microsoft Cloud from datacenters located in Norway and is expected to enable greater innovation for oil and gas and other industries, as well as the public sector.

Extending the value of the Microsoft Cloud regions for Norway, customers can also take advantage of hybrid cloud options with Microsoft Azure Stack. Available through service providers in the region, Azure Stack enables customers to develop solutions that harness the power of consistency between Azure and Azure Stack to cater to unique connectivity and compliance needs.

Microsoft has been rapidly expanding to meet an intensifying customer demand for cloud services. By investing in local infrastructure, Microsoft’s intelligent cloud services help companies innovate in their industries and move their businesses to the cloud while meeting data residency, security and compliance needs. Microsoft also has a long history of collaborating with customers to navigate evolving business needs and has developed strategies to help customers prepare for the new European Union General Data Protection Regulation (GDPR). We have invested to make the Microsoft Cloud GDPR compliant, are delivering innovation that accelerates GDPR compliance, and have built a community of experts to help customers along their full GDPR journey.

Office 365 and Dynamics 365 continue to expand the data residency options for customers with 18 geographies announced. The two products are the only productivity and business application platforms that can offer in-geo data residency across such a broad set of locations. Each datacenter geography delivers a consistent experience, backed by robust policies, controls and systems to help keep data safe and help comply with local and regional regulations.

Over the past three years, the number of Azure regions available has more than doubled. Azure has more regions than any other cloud provider with 52 regions announced across the globe.

Microsoft (Nasdaq “MSFT” @microsoft) enables digital transformation for the era of an intelligent cloud and an intelligent edge. Its mission is to empower every person and every organization on the planet to achieve more.

For more information, press only:
Microsoft Media Relations, WE Communications for Microsoft, (425) 638-7777, rrt@we-worldwide.com

Note to editors: For more information, news and perspectives from Microsoft, please visit the Microsoft News Center at http://news.microsoft.com. Web links, telephone numbers and titles were correct at time of publication but may have changed. For additional assistance, journalists and analysts may contact Microsoft’s Rapid Response Team or other appropriate contacts listed at http://news.microsoft.com/microsoft-public-relations-contacts.

For Sale – Apple Time Capsule 1TB 3G (A1355)

I have a good-condition 1TB Apple Time Capsule (A1355) for sale, as I have now moved to the cloud.

Boxed (2TB box).

Asking for £50 delivered.

Lmk
Shye

Price and currency: 50
Delivery: Delivery cost is included within my country
Payment method: BT
Location: North London N22
Advertised elsewhere?: Not advertised elsewhere
Prefer goods collected?: I prefer the goods to be collected


Archive360 mixes and matches cloud data archiving elements

Archive360 is expanding its cloud data archiving and management platform, but also making it more selective.

The vendor’s Archive2Azure is adding several different elements for customers to archive, including Salesforce and media.

At the same time, it’s moving from a one-size-fits-all platform to a mix-and-match product. That change will lower the cost and reduce complexity for customers, said Bill Tolson, vice president of marketing at Archive360, based in New York. In adding the new elements, it made sense for Archive360 to restructure the platform.

“You only use what you need to use,” Tolson said.

The Archive2Azure management platform stores data in its original format. It maintains data in the customer’s own Azure tenancy.

Archive360 now offers eight cloud data archiving modules: databases, files, healthcare, legacy applications, media, messaging, Salesforce and SharePoint. The vendor previously offered just the file and messaging capabilities.

“In each one of the modules, there is competition,” said George Crump, founder and president of analysis firm Storage Switzerland. “But there’s nobody that I know of that can do it all into one system.”

For example, HubStor, which Tolson labeled one of Archive360’s closest competitors, offers archiving for file storage in Azure.

Drilling down into the archiving advantages

Two of the big benefits to the platform are a reduction in the cost of primary storage and the ability to incrementally add space for data in the cloud, Crump said.

Organizations often have to keep old data from a variety of the modules for compliance or regulatory purposes.

The European Union’s General Data Protection Regulation was a big driver for the updates, Tolson said. GDPR includes a number of new rules regarding customers’ data rights, privacy and protection.

“We’re consolidating and managing a lot of data in a single repository, which makes it easier to respond to GDPR,” Tolson said.

Salesforce users can buy additional storage from the software-as-a-service vendor, but there’s more value in archiving, Tolson said. With Archive2Azure, customers can archive, manage and protect old Salesforce content, minimizing the risk of data deletion.

“Salesforce doesn’t include any archiving capability at all,” Tolson said.

Crump said there are few vendors that can archive Salesforce data, and he doesn’t know of any that migrate it to Azure.

For media archiving, Archive360 platform users can transcribe, index and search audio and video files. Using machine learning, the platform understands and translates content into more than 50 languages, which makes searching faster and more accurate.

We’re consolidating and managing a lot of data in a single repository, which makes it easier to respond to GDPR.
Bill Tolson, vice president of marketing, Archive360

Other new capabilities include the following:

  • archiving of old database applications to save money and space and improve performance;
  • information management and archiving of patient healthcare content, including imaging, for consolidation and security;
  • archiving and management of structured and unstructured legacy data in a searchable cloud platform; and
  • management, analysis and protection of SharePoint data.

The healthcare module has market potential, Crump said. While many think healthcare organizations don’t want to use the cloud, Crump said he has found the opposite to be true.

“They don’t want to store this massive [amount of] data on premises,” he said.

Images in healthcare, for example, have grown in size considerably in just the last few years. In addition, for security and privacy benefits, Archive360 does not see any of the data archived, Crump said.

Logging the logistics

There’s a small architecture difference from the previous version of Archive2Azure, so previous customers won’t shift over automatically. Tolson said some, especially financial customers, will want the new platform, but others won’t want to mess with what they have.

The updates became generally available on Monday. Archive360 had been working with beta customers for about six months.

Archive2Azure is subscription-based. The basic platform starts at $500 per month and rises based on total data under management.

Each cloud data archiving element is priced separately, and the cost is based on capacity, starting as low as $1 per terabyte, per month.

Crump cautioned that licensing and calculating costs could get complex. He noted that customers need to factor in the price of their Azure storage in addition to the Archive2Azure cost.
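To make that combined math concrete, here is a minimal sketch that adds the quoted $500 monthly platform base and the $1-per-terabyte module floor to an assumed Azure storage rate. The Azure rate, module names and volumes below are placeholder assumptions, not published Archive360 or Microsoft pricing.

    # Rough monthly cost sketch for an Archive2Azure-style deployment.
    # The $500 base and $1/TB module floor come from the article; the Azure
    # storage rate is an assumed placeholder, not a quoted price.
    AZURE_STORAGE_PER_TB = 20.0   # assumed $/TB/month in the customer's own Azure tenancy
    PLATFORM_BASE = 500.0         # quoted starting subscription, $/month

    def monthly_cost(modules):
        """modules: dict of module name -> (terabytes archived, $/TB/month for that module)."""
        module_fees = sum(tb * rate for tb, rate in modules.values())
        azure_fees = sum(tb for tb, _ in modules.values()) * AZURE_STORAGE_PER_TB
        return PLATFORM_BASE + module_fees + azure_fees

    # Example: 40 TB of messaging data and 10 TB of Salesforce data,
    # both at the $1/TB floor mentioned in the article.
    print(monthly_cost({"messaging": (40, 1.0), "salesforce": (10, 1.0)}))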

Archive360, which was founded in 2011, claims about 1,000 customers.

Unchecked cloud IoT costs can quickly spiral upward

The convergence of IoT and cloud computing can tantalize enterprises that want to delve into new technology, but it’s potentially a very pricey proposition.

Public cloud providers have pushed heavily into IoT, positioning themselves as a hub for much of the storage and analysis of data collected by these connected devices. Managed services from AWS, Microsoft Azure and others make IoT easy to initiate, but users who don’t properly configure their workloads quickly encounter runaway IoT costs.

Cost overruns on public cloud deployments are nothing new, despite lingering perceptions that these platforms are always a cheaper alternative to private data centers. But IoT architectures are particularly sensitive to metered billing because of the sheer volume of data they produce. For example, a connected device in a factory setting could generate hundreds of unique streams of data every few milliseconds, recording everything from temperatures to acoustics. That can add up to a terabyte uploaded daily to cloud storage.

“The amount of data you transmit and store and analyze is potentially infinite,” said Ezra Gottheil, an analyst at Technology Business Research Inc. in Hampton, N.H. “You can measure things however often you want. And if you measure it often, the amount of data grows without bounds.”
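A quick back-of-envelope calculation shows how those figures compound. The stream count, sampling interval and record size below are assumed, illustrative values rather than numbers from the article, but they land in the same terabyte-a-day range.

    # Back-of-envelope check on the "terabyte a day" claim; all inputs are assumed.
    streams = 300                 # distinct sensor streams on one machine
    sample_interval_s = 0.005     # a reading every 5 milliseconds
    bytes_per_reading = 200       # timestamp, tags and value, roughly

    readings_per_day = streams * (86_400 / sample_interval_s)
    bytes_per_day = readings_per_day * bytes_per_reading
    print(f"{bytes_per_day / 1e12:.2f} TB uploaded per day")   # roughly 1 TB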

Users must also consider networking costs. Most large cloud vendors charge based on communications between the device and their core services. And in typical public cloud fashion, each vendor charges differently for those services.

Predictive analytics reveals, compares IoT costs

To parse the complexity and scale of potential IoT cost considerations, analyst firm 451 Research built a Python simulation and applied predictive analytics to determine costs for 10 million IoT workload configurations. It found Azure was largely the least-expensive option — particularly if resources were purchased in advance — though AWS could be cheaper on deployments with fewer than 20,000 connected devices. It also illuminated how vast pricing complexities hinder straightforward cost comparisons between providers.
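451 Research has not published its model, so the sketch below is only a minimal illustration of the general approach: sweep assumed fleet sizes and message rates, price each configuration under simplified per-message and per-megabyte tariffs, and note which provider comes out cheapest. The tariff figures are placeholders, not real AWS, Azure or Google prices.

    import itertools

    # Minimal sketch of a cost sweep across IoT workload configurations.
    # Tariffs are simplified placeholders, not actual provider price lists.
    TARIFFS = {
        "provider_a": {"per_million_messages": 1.00},   # billed per message
        "provider_b": {"per_million_messages": 0.80},   # billed per message
        "provider_c": {"per_mb": 0.0045},               # billed per data volume
    }

    def monthly_cost(tariff, devices, msgs_per_device_per_day, msg_kb):
        msgs = devices * msgs_per_device_per_day * 30
        if "per_million_messages" in tariff:
            return msgs / 1e6 * tariff["per_million_messages"]
        return msgs * msg_kb / 1024 * tariff["per_mb"]

    for devices, rate in itertools.product([1_000, 20_000, 100_000], [100, 1_000]):
        costs = {name: monthly_cost(t, devices, rate, msg_kb=4) for name, t in TARIFFS.items()}
        cheapest = min(costs, key=costs.get)
        print(devices, rate, cheapest, round(costs[cheapest], 2))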

In a VM, you have a comparison with dedicated servers from before. But with IoT, it’s a whole new world.
Owen Rogers, analyst, 451 Research

For example, Google charges in terms of data transferred, while AWS and Azure charge against the number of messages sent. Yet, AWS and Azure treat messages differently, which can also affect IoT costs; Microsoft caps the size of a message, potentially requiring a customer to send multiple messages.

There are other unexpected charges, said Owen Rogers, a 451 analyst. Google, for example, charges for ping messages, which check that the connection is kept alive. That ping may only be 64 bytes, but Google rounds up to the kilobyte. So, customers essentially pay for unused capacity.

“Each of these models has nuances, and you only really discover them when you look through the terms and conditions,” Rogers said.
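Both of those nuances reduce to simple arithmetic. In the sketch below, the message-size cap and the rounding granularity are illustrative assumptions rather than any provider's published limits.

    import math

    # Illustrative values only; real caps and granularities vary by provider and tier.
    MESSAGE_CAP_KB = 4        # assumed per-message size cap on a per-message-billed service
    ROUNDING_KB = 1           # assumed billing granularity on a per-volume-billed service

    def billable_messages(payload_kb):
        # A payload over the cap is split and billed as several messages.
        return math.ceil(payload_kb / MESSAGE_CAP_KB)

    def billable_kb(payload_bytes):
        # A 64-byte keep-alive ping is rounded up to a full kilobyte.
        return math.ceil(payload_bytes / 1024) * ROUNDING_KB

    print(billable_messages(10))      # a 10 KB reading bills as 3 messages
    print(billable_kb(64))            # a 64-byte ping bills as 1 KB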

Some of these nuances aim to protect the provider or hide complexity from users, but they can still leave customers scratching their heads. Charging discrepancies are endemic to the public cloud, but IoT costs present new challenges for those deciding which cloud to use, especially organizations starting out with no past experience as a reference point.

“How are you going to say it’s less or more than it was before? At least in a VM, you have a comparison with dedicated servers from before. But with IoT, it’s a whole new world,” Rogers said. “If you want to compare providers, it would be almost impossible to do manually.”

There are many unknowns to building an IoT deployment compared to more traditional applications, some of which apply regardless of whether it’s built on the public cloud or in a private data center. Software asset management can be a huge cost at scale. In the case of a connected factory or building, greater heterogeneity affects time and cost, too.

“Developers really need to understand the environment, and they have to be able to program for that environment,” said Alfonso Velosa, a Gartner analyst. “You would set different protocols, logic rules and processes when you’re in the factory for a robot versus a man[-operated] machine versus the air conditioners.”

Data can also get stale rather quickly and, in some cases, become useless, if it’s not used within seconds. Companies must put policies in place to make sure they understand how frequently to record data and transmit the appropriate amount of data back to the cloud. That includes when to move data from active storage to cold storage and if and when to completely purge those records.
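Such a policy is, at its core, a table of age thresholds and actions. The sketch below is a generic illustration under assumed thresholds, not any particular provider's lifecycle API.

    from datetime import datetime, timedelta, timezone

    # Generic illustration of an age-based retention policy for sensor data;
    # the thresholds and actions are assumptions, not any provider's defaults.
    POLICY = [                              # (minimum age, action), oldest rule last
        (timedelta(days=0),       "keep in active (hot) storage"),
        (timedelta(days=30),      "move to cold storage"),
        (timedelta(days=365),     "keep compressed aggregates only"),
        (timedelta(days=365 * 3), "purge"),
    ]

    def decide(record_timestamp, now=None):
        age = (now or datetime.now(timezone.utc)) - record_timestamp
        for min_age, action in reversed(POLICY):
            if age >= min_age:
                return action
        return POLICY[0][1]

    print(decide(datetime.now(timezone.utc) - timedelta(days=120)))   # -> move to cold storage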

“It’s really sitting down and figuring out, ‘What’s the value of this data, and how much do I want to collect?'” Velosa said. “For a lot of folks, it’s still not clear where that value is.”

Ceph-based cloud block storage service debuts from Atlantic.Net

Atlantic.Net launched a cloud block storage service that uses standard HDDs to keep prices down and an NVMe-based flash cache to boost performance over major disk-based alternatives.

Atlantic.Net claims its Secure Block Storage (SBS) performs sequential reads and writes faster than HDD-based block storage from AWS and Microsoft Azure. The hosting provider positions SBS as a middle option between the HDD and SSD Elastic Block Storage (EBS) offerings from AWS.

The Atlantic.Net SBS service uses open source, Ceph-based block storage, with a fast nonvolatile memory express SSD cache and inexpensive 7,200 RPM HDDs on the back end. The Ceph system makes three copies of the data, but the customer pays for only one of them, said Marty Puranik, president and CEO of Atlantic.Net.

Atlantic.Net offers 50 GB free for one year from the date the customer signs up for the service and charges 7.9 cents for each additional gigabyte per month. The 50 GB promotional offer is available to current customers through June 15, 2019.

Cloud block storage pricing

On-demand pricing for block storage in the U.S. was 5.5 cents per gigabyte, per month, in the second quarter, according to 451 Research’s Cloud Price Index. The index is based on the cheapest available option for a provider to fulfill a “small basket” specification of two web and two application servers, with 150 GB and 300 GB of attached storage, internet and inter-data center bandwidth for data ingress and egress, and round-the-clock support. The cloud providers generally use HDDs or SSDs for the block storage.

Atlantic.Net’s pricing is competitive against SSD-backed cloud block storage, but less so against HDD-backed block services, according to Deepak Mohan, a research director in IDC’s storage and infrastructure group. Much would depend on the IOPS level the provider can guarantee, he said.

Atlantic.Net offers a service-level agreement for uptime, Puranik said, but not for IOPS — unless the customer contracts for at least a year. The company negotiates IOPS into contracts with those customers and offers discounts based on the block storage quantity and term commitment, he said.

Atlantic.Net’s SBS price and performance level falls between the EBS hard-disk and SSD options, Puranik said. AWS charges 4.5 cents per gigabyte, per month, for its throughput-optimized HDD volumes; it charges 10 cents per gigabyte, per month, for its general-purpose SSD volumes and even more for provisioned IOPS SSD volumes.

“Our service gives better performance than their regular hard drives, and there’s no additional IOPS charge or overages or anything like that. It’s a straight 7.9 cents [per gigabyte] per month,” Puranik said.
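Using only the per-gigabyte rates quoted above, the storage-only comparison is straightforward arithmetic; the sketch below deliberately ignores provisioned IOPS charges, request fees and term discounts.

    # Monthly storage-only cost comparison from the per-GB rates quoted in the article.
    RATES_PER_GB = {
        "AWS throughput-optimized HDD (EBS)": 0.045,
        "Atlantic.Net SBS": 0.079,
        "AWS general-purpose SSD (EBS)": 0.10,
    }

    volume_gb = 1_000
    for name, rate in RATES_PER_GB.items():
        print(f"{name}: ${volume_gb * rate:,.2f} per month for {volume_gb} GB")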

Atlantic.Net hosting sites

Founded in 1994, Atlantic.Net started out as a residential internet provider before moving to business-to-business connectivity and web hosting. The company operates out of its own data center in Orlando, Fla., and colocation facilities in New York, Dallas, San Francisco, Toronto and London. Additional options in Ashburn, Va., and Singapore are coming soon, Puranik said.

The SBS service is available only at Atlantic.Net’s Orlando data center, but the company plans to extend it to colocation facilities and offer multisite replication, snapshots and backup, Puranik said.

Atlantic.Net has long offered traditional SAN and NAS storage to customers running applications at its Orlando data center. But, Puranik said, the company wanted a cloud block storage service with a higher level of reliability than the limited-capacity, RAID 10-based local SSD storage Atlantic.Net offered in individual compute servers. The SSDs were overkill in some cases, and customers wanted to scale their storage and move volumes between servers, he said.

Atlantic.Net’s local SSD storage option maxed out at about 2 TB, Puranik said, whereas the SBS service enables as much as 16 TB per volume and eight volumes per server.

Atlantic.Net runs the latest Ceph release with the new BlueStore storage back end that improves performance, supports built-in data compression and provides full checksums to verify data integrity.

Atlantic.Net wrote its own control panel to simplify customers’ deployment of open source Ceph, Puranik said. Users see a slider, choose the size of the block storage volume, create it, name it, mount it and point it at the desired server, he said.

“You wouldn’t know Ceph is running on the back end,” Puranik said.

More storage services planned

SBS is Atlantic.Net’s first dedicated public-cloud-based storage service. Puranik said the company would add an object storage service later this year — also based on Ceph — to target unstructured data. Ceph software supports block-, file- and object-based storage, and it scales out through the addition of commodity server nodes.

Puranik said he expects most customers will use the new SBS service to provide cloud block storage for applications that also run on Atlantic.Net’s cloud compute infrastructure.

“Workloads on the block side are more transactional and compute-driven, and if the compute is close to the block storage service, that would make more sense. So, directionally, Atlantic.Net is probably on the right path. But [it] will face a lot of serious competition from someone like an Amazon,” said Amita Potnis, a research manager with IDC’s storage systems program.

Potnis said Atlantic.Net’s London colocation facility could work to its advantage, because there’s considerable room for growth for a niche provider outside of North America, where Amazon dominates.

An Introduction to the Microsoft Hybrid Cloud Concept and Azure Stack

In recent years, so-called "cloud services" have become more and more interesting, and some customers are already considering going 100% cloud. There are a lot of competing cloud products out there, but is there a universal description of a cloud service? That is what I will address here.

Let's start with the basics. For most of IT history, we have all run our own servers in our own datacenters with our own IT staff. The result was a large number of individually configured servers in every company, and the IT team had to manage them all. This put a heavy and growing load on IT administrators: there was no time for new services, and often not even time to patch existing ones to reduce the risk of being hacked. In parallel, development teams and management expected IT to behave in an agile fashion, which was impossible under those conditions.

Defining Cloud Services

This is not a sustainable model, and it is where the cloud comes in. A cloud service is a highly optimized, standardized service consumed out of the box, without small individual changes to its configuration. Cloud services let you simply use a service (much as you draw power from a wall socket) with a predefined and guaranteed SLA (service level agreement). If the SLA is breached, you as the customer even get money back. The catch is that these services have to run in a highly standardized setup, in highly standardized datacenters that are geo-redundant around the world. In the case of Azure, these datacenters are operated in so-called "regions," with a minimum of three datacenters per region.

In addition, Microsoft runs its own backbone (rather than relying on the public internet) to provide a high quality of service, ensuring that available bandwidth meets Quality of Service (QoS) requirements.

To put it in one sentence: a cloud service is a highly standardized IT service with guaranteed SLAs, running in public datacenters and available from anywhere in the world at high quality. From a financial point of view, you generally pay per user, per service or per some other flexible unit, and you can scale usage up or down based on your current needs.

Cloud Services – your options

If you want to invest in cloud services, you will have to choose between:

  • A private Cloud
  • A public Cloud
  • A hybrid Cloud

A private cloud consists of IT services provided by your internal IT team, but delivered in the same way you could obtain them as an external service. It runs in your own datacenter and hosts services only for your company or company group, which means you have to provide the required SLA yourself.

A public cloud describes IT services provided by a hosting provider with a guaranteed SLA. The services are delivered from public datacenters and are not spun up individually just for you.

A hybrid cloud is a mixture of a public and a private cloud; in other words, an internet-connected private cloud whose services are consumed alongside public cloud services. Hybrid cloud deployments can be especially useful when there is a reason not to move a service to a public cloud, such as:

  • Intellectual property must be stored on company-owned, dedicated systems
  • Highly sensitive data (e.g. healthcare records) is not allowed to be stored on public services
  • Poor connectivity in your region could cut you off from public cloud services

Responsibility for Cloud Services

If you decide to go with public cloud services, the question is always how many of your network services are you willing to move to the public cloud?

The general answer is that the more services you can transfer to the cloud, the better the result. However, even the best-laid plans are at the mercy of your internet connectivity, which can cut you off from these services if you have not planned for an outage. Additionally, industry regulations have made a 100% cloud footprint difficult for some organizations. For the majority of business applications, a hybrid solution is therefore the most practical option.

Hybrid Cloud Scenarios

These reasons drove Microsoft's decision to offer Azure for your own datacenter as a packaged solution built on the same technology as Azure itself. Azure is built around the core concepts of REST endpoints and ARM templates (JSON files containing declarative definitions of services). Additionally, Microsoft decided that this on-premises Azure solution should not provide only IaaS; it should be able to run PaaS, too, just like the public Azure cloud.

This basically means that for a service to become available in this new on-premises "Azure Stack," it must already be generally available (GA) in public Azure.
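The ARM templates mentioned above are just declarative JSON documents. The minimal storage-account template below, rendered as a Python dictionary purely for illustration, shows the general shape of the schema; the resource name, region and apiVersion are placeholder values.

    import json

    # Shape of a minimal ARM template (normally authored as a plain JSON file).
    # Name, region and apiVersion below are placeholder values for illustration.
    template = {
        "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
        "contentVersion": "1.0.0.0",
        "resources": [
            {
                "type": "Microsoft.Storage/storageAccounts",
                "apiVersion": "2017-10-01",
                "name": "examplestorage001",
                "location": "westeurope",
                "sku": {"name": "Standard_LRS"},
                "kind": "StorageV2",
            }
        ],
    }

    print(json.dumps(template, indent=2))  # the JSON you would hand to an ARM deployment endpoint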

This solution is called "Azure Stack" and is sold on certified hardware only. That ensures you, as the customer, get the same performance, reliability and scalability from Azure Stack that you expect from Azure.

As of today, the following hardware OEMs are part of this initiative:

  • DELL
  • HPE
  • Lenovo
  • Cisco
  • Huawei
  • Intel/Wortmann
  • Fujitsu

A range of Azure services is available with Azure Stack today, and as it is an agile product from Microsoft, we can expect many interesting updates in the future.

With Azure Stack, Microsoft provides a simple way to spread services between on-premises infrastructure and the public cloud. Possible scenarios include:

  • Disconnected scenarios (Azure Stack in planes or ships)
  • Azure Stack as your development environment for Azure
  • Low latency computing
  • Hosting Platform for MSPs
  • And many more

As we all know, IT is hybrid today across most industries all over the world. With the combination of Azure Stack and Azure, you have the chance to fulfill those requirements and set up a unified cloud model for all of your company's services.

Summary

As you have seen, Azure Stack brings public Azure to your datacenter with the same administration and configuration models you already know from public Azure. There is no need to learn everything twice: training costs go down, and the standardization gives you more flexibility and puts less load on local IT admins, freeing them to work on new, higher-quality solutions. Licensing also becomes less complex with cloud-style, usage-based pricing, and you can even link your Azure Stack licenses directly to an Azure subscription.

With hybrid cloud services set to be the dominant model for the next 10 years or more, Azure and Azure Stack together can make your IT environment more successful than it has ever been.

If you want to learn more about Azure Stack, watch our webinar, Future-proofing your Datacenter with Microsoft Azure Stack.

How about you? Is your organization interested in Azure Stack? Why or why not? We here on the Altaro Blog are interested! Let us know in the comments section below!

Thanks for reading!

Hybrid cloud security architecture requires rethinking

Cloud security isn't for the squeamish. Protecting cloud-based workloads and designing a hybrid cloud security architecture have become more difficult challenges than first envisioned, said Jon Oltsik, an analyst at Enterprise Strategy Group in Milford, Mass.

“The goal was simple,” he said. Enterprises wanted the same security they had for their internal workloads to be extended to the cloud.

But using existing security apps didn’t work out so well. In response, enterprises tried to concoct their own, but that meant the majority of companies had separate security foundations for their on-premises and cloud workloads, Oltsik said.

The answer for creating a robust hybrid cloud security architecture is central policy management, where all workloads are tracked, policies and rules are applied, and networking components are displayed in a centralized console. Firewall and security vendors are beginning to roll out products supporting this strategy, Oltsik said, but it's still incumbent upon CISOs to proceed carefully.

“The move to central network security policy management is a virtual certainty, but which vendors win or lose in this transition remains to be seen.”

Read the rest of what Oltsik had to say about centralized cloud security.

User experience management undergoing a shift

User experience management, or UEM, is a more complex concept than you may realize.

Dennis Drogseth, an analyst at Enterprise Management Associates in Boulder, Colo., described the metamorphosis of UEM, debunking the notion that the methodology is merely a subset of application performance management.

Instead, Drogseth said, UEM is multifaceted, encompassing application performance, business impact, change management, design, user productivity and service usage.

According to EMA research, the two most important areas for UEM over the last three years have been application performance and portfolio planning and optimization. UEM can provide valuable insights to assist both IT and the business.

One question surrounding UEM is whether it falls into the realm of IT or the business. In years past, EMA data suggested 20% of networking staffers considered UEM a business concern, 21% an IT concern, and 59% said UEM should be equally an IT and business concern. Drogseth agreed wholeheartedly with the latter group.

Drogseth expanded on the usefulness of UEM in his blog, including how UEM is important to DevOps and creating an integrated business strategy.

Mixed LPWAN results, but future could be bright

GlobalData analyst Kitty Weldon examined the evolving low-power WAN market in the wake of the 2018 annual conference in London.

Mobile operators built out their LPWAN networks in 2017, Weldon said, and are now looking for traction. Essentially every internet of things (IoT) service hopped on the LPWAN bandwagon; now they await the results.

So far, there have been 48 launches by 26 operators.

The expectation remains that lower costs and improved battery life will eventually usher in thousands of new low-bandwidth IoT devices connecting to LPWANs. However, Weldon notes that it is still the beginning of the LPWAN era, and right now feelings are mixed.

“Clearly, there is some concern in the industry that the anticipated massive uptake of LPWANs will not be realized as easily as they had hoped, but the rollouts continue and optimism remains, tempered with realistic concerns about how best to monetize the investments.”

Read more of what Weldon had to say here.

Google adds single-tenant VMs for compliance, license cares

Google’s latest VM runs counter to standard public cloud frameworks, but its added flexibility checks off another box for enterprise clients.

Google Cloud customers can now access sole-tenant nodes on Google Compute Engine. The benefits for these single-tenant VMs, currently in beta, are threefold: They reduce the “noisy neighbor” issue that can arise on shared servers; add another layer of security, particularly for users with data residency concerns; and make it easier to migrate certain on-premises workloads with stringent licensing restrictions.

The public cloud model was built on the concept of multi-tenancy, which allows providers to squeeze more than one account onto the same physical host and thus operate at economies of scale. Early customers happily gave up some of the advantages of dedicated hardware in exchange for less infrastructure management and the ability to quickly scale out.

But as more traditional corporations adopt public cloud, providers have added isolation capabilities to approximate what’s inside enterprises’ own data centers, such as private networks, virtual private clouds and bare-metal servers. Single tenancy applies that approach down to the hardware level, while maintaining a virtualized architecture. AWS was the first to offer single-tenant VMs with its Dedicated Instances.

Customers access Google’s single-tenant VMs the same way as its other compute instances, except they’re placed on a dedicated server. The location of that node is either auto-selected through a placement algorithm, or customers can manually select the location at launch. These instances are customizable in size, and are charged per second for vCPU and system memory, as well as a 10% sole-tenancy premium.
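As a rough illustration of that billing model, the sketch below applies the 10% sole-tenancy premium on top of assumed per-hour vCPU and memory rates; those unit rates and the node shape are placeholders, not Google's published prices.

    # Rough cost sketch for a sole-tenant node: vCPU and memory charges
    # plus the 10% sole-tenancy premium mentioned in the article.
    # The two unit rates below are assumed placeholders, not published GCP prices.
    VCPU_PER_HOUR = 0.033      # assumed $/vCPU-hour
    GB_RAM_PER_HOUR = 0.0045   # assumed $/GB-hour
    SOLE_TENANCY_PREMIUM = 0.10

    def node_cost(vcpus, ram_gb, hours):
        base = (vcpus * VCPU_PER_HOUR + ram_gb * GB_RAM_PER_HOUR) * hours
        return base * (1 + SOLE_TENANCY_PREMIUM)

    # Example node shape (assumed): 96 vCPUs and 624 GB of memory for a 730-hour month.
    print(f"${node_cost(96, 624, 730):,.2f}")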

Single-tenant VMs another step for Google Cloud’s enterprise appeal

Google still lags behind AWS and Microsoft Azure in public cloud capabilities, but it has added services and support in recent months to shake its image as a cloud valued solely for its engineering. Google must expand its enterprise customer base, especially with large organizations in which multiple stakeholders sign off on use of a particular cloud, said Fernando Montenegro, a 451 Research analyst.

Not all companies will pay the premium for this functionality, but it could be critical to those with compliance concerns, including those that must prove they’re on dedicated hardware in a specific location. For example, a DevOps team may want to build a CI/CD pipeline that releases into production, but a risk-averse security team might have some trepidations. With sole tenancy, that DevOps team has flexibility to spin up and down, while the security team can sign off on it because it meets some internal or external requirement.

“I can see security people being happy that, we can meet our DevOps team halfway, so they can have their DevOps cake and we can have our security compliance cake, too,” Montenegro said.

I can see security people being happy … our DevOps team … can have their DevOps cake and we can have our security compliance cake, too.
Fernando Montenegro, analyst, 451 Research

A less obvious benefit of dedicated hardware involves the lift and shift of legacy systems to the cloud. A traditional ERP contract may require a specific set of sockets or hosts, and it can be a daunting task to ensure a customer complies with licensing stipulations on a multi-tenant platform because the requirements aren’t tied to the VM.

In a bring-your-own-license scenario, these dedicated hosts can optimize customers’ license spending and reduce the cost to run those systems on a public cloud, said Deepak Mohan, an IDC analyst.

“This is certainly an important feature from an enterprise app migration perspective, where security and licensing are often top priority considerations when moving to cloud,” he said.

The noisy neighbor problem arises when a user is concerned that high CPU or IO usage by another VM on the same server will impact the performance of its own application, Mohan said.

“One of the interesting customer examples I heard was a latency-sensitive function that needed to compute and send the response within as short a duration as possible,” he said. “They used dedicated hosts on AWS because they could control resource usage on the server.”

Still, don’t expect this to be the type of feature that a ton of users rush to implement.

“[A single-tenant VM] is most useful where strict compliance/governance is required, and you need it in the public cloud,” said Abhi Dugar, an IDC analyst. “If operating under such strict criteria, it is likely easier to just keep it on prem, so I think it’s a relatively niche use case to put dedicated instances in the cloud.”

Posting passwords on Trello leads to latest data exposure mess

Data exposures in web applications and cloud services are becoming increasingly common these days, and Trello is the latest service being used poorly in the enterprise.

According to an investigation reported by Brian Krebs, Flashpoint security analyst David Shear discovered hundreds of boards exposing data and passwords on Trello from government agencies, healthcare organizations and others. This follows news that large enterprises and government agencies accidentally set Amazon Web Services buckets and Google Groups to public.

By default, Trello boards are either password-protected or visible only to team members. However, it is possible to share boards with anyone on the web, and the contents of those boards can even be indexed by search engines.

Shear found organizations sharing logins and passwords on Trello for corporate WordPress accounts and iPage domain hosting accounts. The Maricopa County Department of Public Health in Phoenix exposed sensitive info, including how to navigate the organization’s payroll system. Even the National Coordinator for Health Information Technology, which is part of the U.S. Department of Health and Human Services, was found to be leaking passwords on Trello.

Many companies don’t realize that they have employees storing confidential or sensitive information on public-facing sites, like Trello.
Justin Jett, director of audit and compliance for Plixer

James Lerud, head of the behavioral research team at Verodin, based in McLean, Va., called this incident “the latest in a long line of public exposures involving improper handling of credentials.”

“There have been numerous examples of private keys out in the open on GitHub. It’s pretty difficult for companies like Trello or GitHub to prevent this type of exposure; the responsibility lies with the users of these services,” Lerud wrote via email. “Companies [that] use these types of services need to regularly audit what kind of data is being exposed to the public and not rely on a third party to discover problems.”
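One practical form of that audit is to list an organization's boards through Trello's REST API and flag anything whose visibility is public. The minimal sketch below assumes a read-capable API key and token and a placeholder workspace name.

    import requests

    # Minimal sketch: list an organization's Trello boards and flag public ones.
    # KEY, TOKEN and ORG are placeholders; the token only needs read access.
    API = "https://api.trello.com/1"
    KEY, TOKEN, ORG = "your-key", "your-token", "your-org"

    resp = requests.get(
        f"{API}/organizations/{ORG}/boards",
        params={"key": KEY, "token": TOKEN, "fields": "name,url,prefs"},
    )
    resp.raise_for_status()

    for board in resp.json():
        if board.get("prefs", {}).get("permissionLevel") == "public":
            print("PUBLIC:", board["name"], board["url"])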

Justin Jett, director of audit and compliance for Plixer, based in Kennebunk, Maine, said it is “an extremely dangerous practice to store credentials on public-facing sites or directories.”

“Passwords should never be stored in a manner that could be perceived as plaintext. They should be stored in a secure and encrypted environment where data thieves aren’t given a free pass to the data. The number of phishing attacks used to steal users’ credentials continues to grow; we don’t need to make it even easier for the thieves,” Jett wrote via email.

“Many companies don’t realize that they have employees storing confidential or sensitive information on public-facing sites, like Trello. Security teams often don’t have the visibility they need to know when credentials have been compromised until after a data breach,” Jett wrote. “In many cases, it likely comes down to a matter of educating users on best practices. Corporate training can aid the reduction of such blatant exposure of sensitive data.”