
What are some considerations for a public folders migration?


A public folders migration from one version of Exchange to another can tax the skills of an experienced administrator — but there’s another level of complexity when cloud enters the mix.

A session at last week’s Virtualization Technology Users Group event in Foxborough, Mass., detailed the nuances of Office 365 subscription offerings and the migration challenges administrators face. Microsoft offers a la carte choices for companies that wish to sign up for a single cloud service, such as Exchange Online, and move the messaging platform into the cloud, said Michael Shaw, a solution architect for Office 365 at Whalley Computer Associates in Southwick, Mass., in his presentation.

Microsoft offers newer collaboration services in Office 365, but some IT departments cling to one holdover that the company cannot extinguish — public folders. This popular feature, introduced in 1996 with Exchange 4.0, gives users a shared location to store documents, contacts and calendars.

For companies on Exchange 2013/2016, Microsoft did not offer a way to move “modern” public folders — called “public folder mailboxes” after an architecture change in Exchange 2013 — to Office 365 until March 2017. Prior to that, many organizations either developed their own public folders migration process, used a third-party tool or brought in experts to help with the transition.

Organizations that want to use existing public folders after a switch from on-premises Exchange to Office 365 should be aware of the proper sequence to avoid issues with a public folders migration, Shaw said.

Most importantly, public folders should be migrated last. That’s because mailboxes in Office 365 can access a public folder that is on premises, but a mailbox that is on premises cannot access public folders in the cloud, Shaw said.

“New can always access old, but old can’t access new,” he said.
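For shops that follow that sequence, the public folder cutover itself typically runs through Exchange’s batch-migration cmdlets. The PowerShell below is a rough, illustrative sketch only, not Shaw’s procedure; $PfEndpoint, the mapping CSV and the notification address are placeholders, and the exact steps vary by source Exchange version:

# Illustrative sketch: create and start the public folder migration batch
# only after all user mailboxes have already moved to Exchange Online.
# $PfEndpoint, the CSV mapping file and the email address are placeholders.
New-MigrationBatch -Name PublicFolderMigration `
    -CSVData (Get-Content .\FolderToMailboxMap.csv -Encoding Byte) `
    -SourceEndpoint $PfEndpoint.Identity `
    -NotificationEmails admin@contoso.com
Start-MigrationBatch -Identity PublicFolderMigration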

IT admins should keep in mind, however, that Microsoft dissuades customers from using public folders for document storage due to potential issues when multiple people try to work on the same file. Instead, the company steers Office 365 shops to SharePoint Online for document collaboration, and the Groups service for shared calendars and mobile device access.

In another attempt to prevent public folders migration to Office 365, Microsoft caps public folder mailboxes in Exchange Online at 1,000. They also come with a limit of 50 GB per mailbox in the lower subscription levels and a 100 GB quota in the higher E3 and E5 tiers. Public folder storage cannot exceed 50 TB.

Still, support for public folders has no foreseeable end despite Microsoft’s efforts to eradicate the feature. Microsoft did not include public folders in Exchange Server 2007, but reintroduced them in a service pack after significant outcry from customers, Shaw said. Similarly, there was no support for public folders when Microsoft introduced Office 365 in 2011, but it later buckled to customer demand.

Microsoft’s Surface Book 2 really is a beautiful beast


Using a 15-inch, 4.2-pound powerhouse like Microsoft’s new Surface Book 2 takes some getting used to. It nearly fits into my backpack, but weighs on my back. It’s unwieldy to carry around the office, but has remarkable battery life. It takes up more space on my desk than my other computers, but has workstation-level power.

These are the type of calculations I make as a longtime ultra-light convertible user. My systems of choice typically weigh 2 pounds or less. I know that Microsoft’s Surface Pro and Apple’s MacBook, at 12 and 13.5 inches respectively, have their limits, but I love them for their combination of portability, power and battery life. If I wanted to do more, like edit 4K video, edit high-resolution images, start doing CAD work, program or game my neurons away, I’d consider a portable like this.

The beast at rest.

Image: Lance Ulanoff/Mashable

The Surface Book 2 is not a reinvention of the original Surface Book. At a glance, and aside from the size, it’s indistinguishable from the Surface Book with Performance Base I have in my office. It has the same magnesium body and a similarly designed keyboard and trackpad.

Broadly, the component design is the same. Microsoft puts the CPU in the “Clipboard” PixelSense touch screen and the discrete graphics in the base. You can still detach the screen with the press of a button, and the signature dynamic fulcrum hinge that connects the base and screen looks and feels unchanged.

Something’s different

The closer I looked, though, the more I saw evidence of the 1,000-part changes Microsoft Windows Device Lead Panos Panay told me about in October. Many are subtle. For example, even though the keyboard is the exact same size on the Surface Book 2 15-inch, it now sits on a relatively flat plane. The channel that surrounded the original keyboard, nesting it slightly more deeply in the chassis, is gone.

The new Surface Book 2 15-inch (left) next to the 13.5-inch Performance Base model.

Image: Lance Ulanoff/Mashable

The screen looks similar, but in addition to a higher resolution of 3240 x 2160 (versus 3000 x 2000 on the 13.5-inch Surface Book 2), the frame that surrounds it is a tad sleeker. Microsoft got rid of the chamfered edge, which gives the Clipboard a cleaner look. The front-facing camera resolution (5 MP) is unchanged, but Microsoft now hides it and the IR camera (for Windows Hello facial recognition) behind a darker black screen frame.

Panay told me the hinge is completely redesigned, but it looks and works the same as before. Changes are only evident when you detach the screen. There’s a somewhat different and slightly quieter detach click when you hit the Surface Book 2’s detach button on the keyboard. And when I pulled away the 15-inch display, I noticed that, while the trio of digital connectors looked unchanged, the magnets that hold the screen in place are somewhat smaller than those on the original Surface Book.


As you would expect for a workhorse laptop – even a hybrid like this – the Surface Book 2 maintains its supply of ports. There are still two USB 3.1 ports and an SD card slot on the left side of the keyboard. The right side, though, is home to one of the most important port changes Microsoft has made in the history of the Surface brand: there’s finally a USB-C port. It’s there for data or charging, though you can’t grab just any USB charger. I tried a USB-C cable plugged into a Samsung charger, and the system informed me that the computer wasn’t charging and that I should use a recommended USB-C charger.

No big changes to the ports on the left side (USB got upgraded from 3.0 to 3.1).

Image: Lance Ulanoff/Mashable

Goodbye DisplayPort, hello USB-C.

Image: Lance Ulanoff/Mashable

The introduction of USB-C also means the loss of a DisplayPort. I have numerous adapters that convert, for instance, VGA to DisplayPort and HDMI to DisplayPort, but, sadly, no display adapters that terminate in USB-C.

Next to that new port is Microsoft’s proprietary Surface Connector power/data port. It connects to the same size Surface Connector plug as any other Surface device; however, the Surface Book 2 power cable is much thicker and the power brick much larger than any Surface Book or Pro power source before it. Just something to keep in mind if you plan on doing a lot of traveling with this behemoth.

Hardcore insides

Obviously, the Surface Book 2’s biggest changes are the ones I can’t see.

My $3,299 test system is packed with 1TB of storage and 16GB of RAM.

There are myriad tiny differences between the last Surface Book and the new model. Among them are the redesigned base vents seen here on the Surface Book 2.

Image: Lance Ulanoff/Mashable

Surface Book with Performance Base vents

Image: Lance Ulanoff/Mashable

Inside the PixelSense display is Intel’s 8th Generation Core i7 CPU. The fan-less design means that, for many processor-intensive applications, the Surface Book 2 is whisper quiet. However, like the Surface Book with Performance Base before it, there’s also resource-intensive silicon in the base. The 15-inch Surface Book 2 features a powerful Nvidia GeForce GTX 1060 with 6 GB of RAM. When I started running 3D operations and the Mixed Reality Viewer in the Windows 10 Fall Creators Update, the base fan spun up. The good news is that it’s quieter than the fan in the Surface Book with Performance Base.

The new Surface Book 2 handily beats the Performance Base model on both single-core and multi-core scores in Geekbench. The multi-core score is nearly double that system’s. The Surface Book 2’s scores come close to, but do not beat, those of the 15-inch MacBook Pro with Retina display running a quad-core Intel Core i7.

In a vacuum, however, these numbers mean nothing. What matters is how the Surface Book 2 performs the myriad heavy-lift tasks required by modern knowledge workers. I’m pleased to report that across mundane browser tasks, intense Photoshop work, insanely big spreadsheets and entertaining PC gaming operations, the Surface Book 2 didn’t miss a beat. 

Microsoft claims 17 hours of battery life (using the batteries in both the base and display). Over the course of two days of myriad tasks, with brightness set to max and the screen never allowed to time out, I got roughly 12. I’m certain I could’ve done much better in battery-saver mode, but, regardless, your mileage will surely vary.

Keyboard, Screen, Pen

I’ve been writing this review on the Surface Book 2’s excellent keyboard. The keys have substantial travel (roughly 1.5 mm) and response, as well as enough spacing to make touch-typing a breeze. Unlike the huge 7-inch trackpad on Apple’s MacBook Pro, the Surface Book 2’s 5-inch trackpad does physically move, but only along the bottom edge, where you’ll press for left or right clicks. Thanks to the glass covering, there’s no drag, and the trackpad was responsive to touches, taps and gestures like pinch to zoom.

The Surface Book 2 is designed for touch and pen input.

Image: Lance Ulanoff/Mashable

The biggest difference between Apple’s MacBook line and Microsoft’s Surface Books (aside from the operating system) is that the Surface Book display is a touch screen and a standalone tablet computer. As such, it includes accelerometers and gyroscopes that allow it to measure movement. I can detach the 15-inch screen and play Asphalt Extreme, turning the display back and forth like a steering wheel to control the action on screen.

That’s simply not possible with the MacBook Pro. If you want that kind of functionality from a large-screen Apple device, get an iPad.

In addition, the Surface Book 2 works with the $99 Surface Pen (not included) and Windows 10 is the most pen-friendly desktop (and laptop) OS on the planet. It’s a true pleasure to detach the screen, flip it around, fold it back onto the keyboard, rest my palm on the screen and start drawing on the expansive 15-inch touch display.

The PixelSense Display is still a full-blown tablet in its own right.

Image: Lance Ulanoff/Mashable

The Surface Book 2 base looks a little sad without the display.

Image: Lance Ulanoff/Mashable

I really do love this screen. Visually, it’s an improvement over the last Surface Book and easily as good as anything Apple has produced at a similar scale. Obviously, it’s not as thin as a MacBook Pro screen, but then those screens don’t have a battery and CPU inside.

If you do a lot of art, design, Photoshop, CAD or even programming work, and have $3,299, this is the premium workhorse convertible for you. I do think Microsoft should consider throwing in a Surface Pen (come on, guys, even at the base price, people are spending over $2,400 for one of these rigs).

Would I buy it? No. Not because it isn’t excellent, but because I get all the power and performance I need – with a lot less weight — from the equally versatile Microsoft Surface Pro.

Microsoft Surface Book 2

The Good

  • Sharp design
  • Ample power
  • Excellent and responsive touch screen
  • Decent battery life
  • Quiet operation
  • Tremendously versatile

The Bad

  • Big screen wobbles a bit when you move it
  • Power and battery life add up to weight

The Bottom Line

Microsoft’s first 15-inch laptop convertible is powerful, attractive, pricey, and ready for anything.


Never mind the DevOps maturity model, focus on principles

There are few names in DevOps as big as Gary Gruver. He’s an experienced software executive with a knack for implementing continuous release and deployment pipelines in large organizations. In fact, he literally wrote the book on the subject. His latest, Starting and Scaling DevOps in the Enterprise, is an insightful and easy-to-read guide that breaks down DevOps principles by putting them all in a context enterprises can use to gain alignment on their journey to continuous delivery.

Gruver, president of Gruver Consulting, sat down with DevOpsAgenda to discuss the DevOps maturity model, core DevOps maturity principles, and how small and large organizations must take different paths on their DevOps journey.

What’s your take on the DevOps maturity model? Can that stymie DevOps adoption in large organizations?

Gary Gruver: A lot of people come out with these maturity models and say, ‘We had some success in DevOps, and now everybody has to do what we did.’


And what I find when I go into different organizations, or even looking at different deployment pipelines within organizations, [is] that the things impacting productivity are fundamentally different. You look at a DevOps maturity model and it might claim, ‘You need to have Infrastructure as code, automated deployment, test automation, and this and that.’ I think that overlooks the actual problem each different deployment pipeline might have.

This is about organizational change management, and it’s about getting people to work in different ways. If you don’t start with the changes that are going to benefit people the most, you’re going to lose the momentum in your transformation.

Therefore, I think it’s important to start with DevOps principles, so people can pick the changes that’ll make the biggest difference to them and take ownership of implementing those changes in their organization.

How does scale affect success in DevOps?

Gruver: If you’re a small team and you have four or five developers, then DevOps is about just getting people to embrace and take ownership of code all the way out to the customer. It’s making sure the code is meeting the needs of customers and stable in production. Then, it’s responding to that feedback and taking ownership. It’s a lot about helping these developers become generalists and understanding the operations piece of this puzzle.

But if you have a tightly coupled system that requires thousands of people working together, then there aren’t that many people who are going to know the whole system and be able to support it in production. In these situations, someone needs to be responsible for designing how these complex systems come together and continually improve the process. It is going to require more specialists, because it is hard for everyone to understand the complexities of these large systems. The ways you coordinate five people are a lot different than coordinating a thousand.

Maturity models aren’t the only models of use. Organizations can turn to Simon Wardley’s Pioneers, Settlers and Town Planners model to reach DevOps efficiency in the best, most organic way for them.

What are some of the difficulties in applying DevOps practices from small- to large-scale organizations?

Gruver: What I hear a lot of people in large organizations do with DevOps is they look at what the small teams are doing, and they try to replicate that. They try to reproduce and figure out how to make the small-team strategy work in a tightly coupled system, instead of really looking at the issues blocking them from releasing on a more frequent basis.

They’re not asking, ‘What are the ways we can address this waste and inefficiency and take it out of the system so we can release more frequently?’ They figure if they just do what the small teams are doing and try to replicate that and create a DevOps maturity model, by some magic, they’re going to be successful. Instead of doing that, they should focus on principles to figure out what’s going on in their system.

Large organizations should break the system down as small as they possibly can, because smaller things are much easier to solve, maintain and manage. So, if you can break your system down into microservices and make that work, those teams are always going to be more efficient. That said, rearchitecting a large, tightly coupled system can be extremely complex and time-consuming, so it is typically not my first choice.

Additionally, there are a lot of DevOps practices that can be successfully implemented in large, tightly coupled systems. In fact, I would argue that applying DevOps principles in these complex systems will provide much greater benefits to the organization, because the inefficiencies associated with coordinating work across large groups are so much more pronounced than they are with small teams.

AWS management tools: Partners gain visibility with buyers

Cloud consultants experienced in AWS management tools have new avenues to promote their skills. Amazon Web Services recently added four tools to its AWS Service Delivery Program, which the cloud provider unveiled in November 2016.

The program aims to help customers identify AWS Partner Network companies with expertise in specific skill or service areas. The AWS management tools added to the program are AWS CloudFormation, Amazon EC2 Systems Manager, AWS Config and AWS CloudTrail. According to an AWS blog post, 11 companies have been named Management Tools Service Delivery launch partners.

Among those companies is 2nd Watch, a managed public cloud provider based in Seattle. The company has obtained AWS Service Delivery launch partner status for AWS CloudFormation, AWS CloudTrail and AWS Config.

“Customers looking to leverage and understand how to use AWS Config, CloudTrail or CloudFormation will be able to find us [via the AWS partner portal] and understand we have more than just the basic knowledge of these products and how to use them,” said Jeff Aden, executive vice president of strategic business development and marketing at 2nd Watch.


Aden said those three tools are commonly implemented across 2nd Watch’s customer base.

Companies that become an AWS management tools delivery partner can promote their offerings through channels such as the AWS Service Delivery website, the Partner Solutions Finder and the services partner page, according to AWS.

Other launch partners for the AWS management tools delivery designation include Cloudnexa, Cloudreach, Cloudticity, Cognizant, Datapipe, Flux7, Foghorn Consulting, Logicworks, REAN Cloud and Stelligent.

NAYA Tech, NuoDB in reseller pact

NAYA Tech, a database consulting and managed IT services company based in Sunnyvale, Calif., has entered a reseller relationship with NuoDB, a company that provides an elastic SQL database for hybrid cloud applications.

Yair Rozilio, CEO and founder of NAYA Tech, suggested NuoDB’s technology will have applicability across multiple industries, noting that NAYA Tech has an opportunity to offer its services to both enterprise customers and startups. Specifically, the company plans to focus on industries with a pressing need for an elastic, cloud-centric transactional database.

“We’ll focus first on independent software [vendors] offering SaaS solutions that require immense scale, and also expand into other areas such as the financial and communications sectors,” Rozilio said.

Rozilio said NuoDB suits two types of customers for which NAYA Tech can provide services. For customers with existing applications that need greater scale, elasticity or lower cost than their current databases, NAYA Tech offers a migration-as-a-service turnkey solution. In this model, the company takes a customer’s existing database architecture and applications and migrates them to NuoDB. For customers looking for the appropriate database technology to power next-generation data architectures, NAYA Tech offers design and implementation services, Rozilio explained.

NAYA Tech plans to train dozens of its core database technology consultants on NuoDB, Rozilio said. He said those consultants include solutions architects, database administrators and developers experienced with relational and NoSQL database technologies.

Stephen Fahey, senior vice president of sales at NuoDB, based in Cambridge, Mass., said the company has worked with a number of international resellers and consultants, but noted its relationship with NAYA Tech represents its first reseller agreement based in the U.S. 

BackupAssist rolls out ransomware protection

BackupAssist, a Windows server backup and recovery vendor, introduced CryptoSafeGuard, a new ransomware protection product for the SMB market. The company sells its products through resellers and managed service provider partners.

CryptoSafeGuard, which integrates with the vendor’s backup software, aims to protect customers from ransomware infections, complementing existing security products such as firewalls “by providing an extra layer of protection to the backup,” said Linus Chang, CEO of BackupAssist.

“We are not trying to replace existing [security] solutions. We are actually providing an extra safety net,” he said.

“Recently, [ransomware] has evolved into a threat for all businesses,” and the threat is growing exponentially every year, said Troy Vertigan, BackupAssist’s vice president of sales and marketing. “Businesses have a higher chance … of being taken down by ransomware more than the risk of a flood or a fire,” he noted.

Customers of the company’s BackupCare subscription service will have access to CryptoSafeGuard through version 10.1 of BackupAssist’s software.

Hurricane relief for partners

ConnectWise, a company that offers remote monitoring and management, professional services automation and other products for service providers, is raising funds to assist partners affected by Hurricane Harvey. In a blog post, ConnectWise CEO Arnie Bellini said the company aims to “help our partners survive as entrepreneurs and reestablish their successful businesses.” The company plans to raise $750,000 and will match donations to ConnectWise.com/HelpNow.

Other news

  • Datatec Ltd., the parent company of Westcon-Comstor, has closed the sale of the distributor’s North America and Latin America business to Synnex Corp. The deal also includes the sale of a minority share in Westcon-Comstor’s EMEA and Asia-Pacific business to Synnex. Datatec’s plan to sell the Westcon-Comstor assets was disclosed in June 2017.
  • In other acquisition news, VRP Consulting, a Salesforce consulting firm and digital transformation services provider based in San Francisco, has purchased CodeSWAT, a Salesforce consultancy in Santa Clara, Calif.
  • SolarWinds MSP, an IT service management technology provider, revealed it has purchased SpamExperts, an email security company. SpamExperts currently offers services for email archiving, as well as spam and virus filtering. SolarWinds MSP said the acquisition will expand its SolarWinds MSP Mail technology.
  • Data protection company StorageCraft launched a $100,000 Recovery Guarantee for qualified partners. The guarantee covers the recovery of virtual or physical machines, either on premises or in the cloud, according to StorageCraft.
  • The NFL’s Jacksonville Jaguars have tapped TierPoint LLC to provide colocation and other managed data center services. TierPoint in 2015 acquired its Jacksonville, Fla., data center and has since completed an infrastructure and security upgrade, according to the company. Shifting from AFC South to AFC North territory, TierPoint also announced it will build a second data center in Baltimore. The planned 35,000 sq. ft. facility will see an initial investment of more than $10 million.

Market Share is a news roundup published every Friday.

Announcing the public preview of Azure Archive Blob Storage and Blob-Level Tiering

From startups to large organizations, our customers in every industry have experienced exponential growth of their data. A significant amount of this data is rarely accessed but must be stored for a long period of time to meet business continuity and compliance requirements. Examples include employee data, medical records, customer information, financial records, backups, etc. Additionally, recent and coming advances in artificial intelligence and data analytics are unlocking value from data that might have previously been discarded. Customers want to keep more of these data sets for a longer period but need a scalable and cost-effective solution to do so.

Last year, we launched Cool Blob Storage to help customers reduce storage costs by tiering their infrequently accessed data to the Cool tier. Today we’re announcing the public preview of Archive Blob Storage designed to help organizations reduce their storage costs even further by storing rarely accessed data in our lowest-priced tier yet. Furthermore, we’re excited to introduce the public preview of Blob-Level Tiering enabling you to optimize storage costs by easily managing the lifecycle of your data across these tiers at the object level.

The CEO of HubStor, a leading enterprise backup and archiving company, stated: “We are jumping for joy to see the amazing design Microsoft successfully implemented. Azure Archive Blob Storage is indeed an excellent example of Microsoft leapfrogging the competition.”

Azure Archive Blob Storage

Azure Archive Blob storage is designed to provide organizations with a low-cost means of delivering durable, highly available, secure cloud storage for rarely accessed data with flexible latency requirements (on the order of hours). See Azure Blob Storage: Hot, cool, and archive tiers to learn more.

The Archive tier, in addition to Hot and Cool access tiers, is now available in Blob Storage accounts. Archive Storage characteristics include:

  • Cost-effectiveness: The Archive access tier is our lowest-priced storage offering. Customers with long-term storage that is rarely accessed can take advantage of it. For more details on regional preview pricing, see Azure Storage Pricing.
  • Seamless Integration: Customers use the same familiar operations on blobs in the Archive tier as on blobs in the Hot and Cool access tiers. This will enable customers to easily integrate the new access tier into their applications.
  • Availability: The Archive access tier will provide the same 99% availability SLA (at General Availability (GA)) offered by the Cool access tier.
  • Durability: All access tiers including Archive are designed to offer the same high durability that you have come to expect from Azure Storage with the same data replication options available today.
  • Security: All data in the Archive access tier is automatically encrypted at rest.

Blob-Level Tiering: easily optimize storage costs without moving your data

To simplify data lifecycle management, we now allow customers to tier their data at the blob level. Customers can easily change the access tier of a blob among the Hot, Cool, or Archive tiers as usage patterns change, without having to move data between accounts. Blobs in all three access tiers can co-exist within the same account.
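As a concrete illustration, here is a minimal PowerShell sketch of blob-level tiering, assuming the Azure.Storage PowerShell module; the account, container and blob names are placeholders:

# Build a storage context for the account (names are placeholders).
$ctx = New-AzureStorageContext -StorageAccountName "mystorageaccount" -StorageAccountKey $accountKey
# Grab a single blob and move just that blob to the Archive tier;
# "Hot" and "Cool" work the same way. Note that moving a blob back out
# of Archive is a rehydration that can take hours to complete.
$blob = Get-AzureStorageBlob -Container "mycontainer" -Blob "rarely-used.log" -Context $ctx
$blob.ICloudBlob.SetStandardBlobTier("Archive")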

Flexible management

Archive Storage and Blob-level Tiering will be available on all Blob Storage accounts. For customers with large volumes of data in General Purpose accounts, we will allow you to upgrade your account to get access to Cool, Archive, and Blob-level Tiering at GA.

Initially, the feature is accessible through the .NET (see Figure 1), Python (preview), and Node.js client libraries, as well as the REST APIs. Support for the Java client library and the portal (see Figure 2) will roll out over the next week. Other SDKs and tools will be supported in the next few months.


Figure 1: Set blob access tier using .NET client library


Figure 2: Set blob access tier in portal

Pricing

Pricing for Azure Archive Blob Storage is reduced during the preview. Please refer to the Azure Blob Storage Pricing page for more details.

How to get started

To enroll in the public preview, you will need to submit a request to register this feature with your subscription. After your request is approved (within 1-2 days), any new LRS Blob Storage account you create in US East 2 will have the Archive access tier enabled, and all new accounts in all public regions will have blob-level tiering enabled. During the preview, only LRS accounts will be supported, but we plan to extend support to GRS and RA-GRS accounts (new and existing) at GA. Blob-level tiering will not be supported for any blob with snapshots. As with most previews, this should not be used for production workloads until the feature reaches GA.

To submit a request, run the following PowerShell or CLI commands.

PowerShell

Register-AzureRmProviderFeature -FeatureName AllowArchive -ProviderNamespace Microsoft.Storage

This will return the following response:

FeatureName         ProviderName      RegistrationState 
-----------         ------------      ----------------- 
AllowArchive        Microsoft.Storage   Pending 

It may take 1-2 days to receive approval.  To verify successful registration approval, run the following command:

Get-AzureRmProviderFeature -FeatureName AllowArchive -ProviderNamespace Microsoft.Storage

If the feature was approved and properly registered, you should receive the following output:

FeatureName         ProviderName      RegistrationState 
-----------         ------------      ----------------- 
AllowArchive        Microsoft.Storage   Registered  

CLI 2.0

az feature register --namespace Microsoft.Storage --name AllowArchive

This will return the following response:

{
  "id": "/subscriptions/xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx/providers/Microsoft.Features/providers/Microsoft.Storage/features/AllowArchive",
  "name": "Microsoft.Storage/AllowArchive",
  "properties": {
    "state": "Pending"
  },
  "type": "Microsoft.Features/providers/features"
}

It may take 1-2 days to receive approval.  To verify successful registration approval, run the following command:

az feature show --namespace Microsoft.Storage --name AllowArchive

If the feature was approved and properly registered, you should receive the following output:

{
  "id": "/subscriptions/xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx/providers/Microsoft.Features/providers/Microsoft.Storage/features/AllowArchive",
  "name": "Microsoft.Storage/AllowArchive",
  "properties": {
    "state": "Registered"
  },
  "type": "Microsoft.Features/providers/features"
}

Get it, use it, and tell us about it

We’re confident that Azure Archive Blob Storage will provide another critical element for optimizing your organization’s cloud data storage strategy. As this is a preview, we look forward to hearing your feedback on these features, which you can send by email to us at archivefeedback@microsoft.com.