
For Sale – HP Z1 Professional All-In-One Workstation

HP Z1 G1 professional workstation. I’ve owned this since new – it was purchased in 2013 and is only up for sale due to a recent upgrade to a two-monitor setup.

All-in-one design with 27″ IPS panel. Absolutely mint condition with all original packaging, HP system recovery disks, plus an unused and boxed keyboard and mouse.

I upgraded the RAM at the time of purchase to 32 GB of ECC memory – that memory type alone will now set you back in excess of £300. It came pre-installed with Windows 7 Pro x64 but was entitled to a Windows 10 upgrade. I’ve stuck a brand new 120 GB SSD in it and installed a clean copy of Win 10 Pro that is fully activated by Microsoft.

Fantastic modular design – the case opens on gas struts for easy tool free access to main components.

Motherboard was replaced by HP under warranty early last year as the sound card had failed.

This is a beast of a workstation for professional use – it flies with an Intel server-spec processor and the onboard RAM. Full specs are:

Xeon E3-1280 3.5 GHz quad-core processor
32 GB ECC RAM
Dedicated graphics card – Nvidia Quadro 1000M
27″ IPS display (non-touch screen)
Optical drive – DVD±RW (±R DL) / DVD-RAM
120 GB SSD plus space in modular caddy for additional 2.5″ HDD
Intel gigabit NIC with RJ-45 connector
Wireless network adaptor on board
Bluetooth
HP integrated web cam
Quality onboard speakers built into the base of the chassis.
4 x USB 2 ports on rear
2 x USB 3 ports on side
IEEE-1394a FireWire connector
Media – xD/MMC/MS/SD card reader
Optical S/PDIF audio output
DisplayPort connector for additional external monitor
Full set of audio in/outputs including subwoofer.

It is a fantastic piece of kit looking for a new home.

I am based in West Sussex and I’m happy to meet up for delivery within a 30 mile radius of Horsham. If you want to arrange for your own courier collection from my home address, then that’s fine too so long as insurance is included but bear in mind it is a large and very heavy package.


Go to Original Article
Author:

Salesforce acquisition of Tableau finally getting real

LAS VEGAS — It’s been more than five months since the Salesforce acquisition of Tableau was first revealed, but most of that time has been spent waiting.

Even after the deal closed on Aug. 1, a regulatory review in the United Kingdom about how the Salesforce acquisition of Tableau might affect competition held up the integration of the two companies.

In fact, it wasn’t until last week, on Nov. 5, when the U.K. Competition and Markets Authority (CMA) gave the go-ahead, exactly a week before the start of Tableau Conference 2019, the vendor’s annual user conference, that Salesforce and Tableau were even allowed to start speaking with each other. Salesforce’s big Dreamforce 2019 conference is Nov. 19-22.

Meanwhile, Tableau didn’t just stop what it was doing. The analytics and business intelligence software vendor continued to introduce new products and update existing ones. Just before Tableau Conference 2019, it rolled out a series of new tools and product upgrades.

Perhaps most importantly, Tableau revealed an enhanced partnership agreement with Amazon Web Services, called Modern Cloud Analytics, that will help Tableau’s many on-premises users migrate to the cloud.

Andrew Beers, Tableau’s chief technology officer, discussed the recent swirl of events in a two-part Q&A.

In Part I, Beers reflected on Tableau’s product news, much of it centered on new data management capabilities and enhanced augmented intelligence powers. In Part II, he discusses the Salesforce acquisition of Tableau and what the future might look like now that the $15.7 billion purchase is no longer on hold.

Will the Salesforce acquisition of Tableau change Tableau in any way?

Andrew Beers: It would be naïve to assume that it wouldn’t. We are super excited about the acceleration that it’s going to offer us, both in terms of the customers we’re talking to and the technology that we have access to. There are a lot of opportunities for us to accelerate, and as [Salesforce CEO] Marc Benioff was saying [during the keynote speech] on Wednesday, the cultures of the two companies are really aligned, the vision about the future is really aligned, so I think overall it’s going to mean analytics inside businesses is just going to move faster.

Technologically speaking, are there any specific ways the Salesforce acquisition of Tableau might accelerate Tableau’s capabilities?


Beers: It’s hard to say right now. Just last week the CMA [order] was lifted. There was a big cheer, and then everyone said, ‘But wait, we have two conferences to put on.’

Have you had any strategic conversations with Salesforce in just the week or so since regulatory restrictions were lifted, even though Tableau Conference 2019 is this week and Salesforce Dreamforce 2019 is next week?

Beers: Oh sure, and a lot of it has been about the conferences of course, but there’s been some early planning on how to take some steps together. But it’s still super early.

Users, of course, fear somewhat that what they love about Tableau might get lost as a result of the Salesforce acquisition of Tableau. What can you say to alleviate their worries?

Beers: The community that Tableau has built, and the community that Salesforce has built, they’re both these really excited and empowered communities, and that goes back to the cultural alignment of the companies. As a member of the Tableau community, I would encourage people to be excited. To have two companies come together that have similar views on the importance of the community, the product line, the ecosystem that the company is trying to create, it’s exciting.

Is the long-term plan — the long-term expectation — for Tableau to remain autonomous under Salesforce?


Beers: We’ve gone into this saying that Tableau is going to continue to operate as Tableau, but long-term, I can’t answer that question. It’s really hard for anyone to say.

From a technological perspective, as a technology officer, what about the Salesforce acquisition of Tableau excites you — what are some things that Salesforce does that you can’t wait to get access to?

Beers: Salesforce spent the past 10 or so years changing into a different company, and I’m not sure a lot of people noticed. They went from being a CRM company to being this digital-suite-for-the-enterprise company, so they’ve got a lot of interesting technology. Just thinking of analytics, they’ve built some cool stuff with Einstein. What does that mean when you bring it into the Tableau environment? I don’t know, but I’m excited to find out. They’ve got some interesting tools that hold their whole ecosystem together, and I’m interested in what that means for analysts and for Tableau. I think there are a lot of exciting technology topics ahead of us.

What about conversations you might have with Salesforce technology officers, learning from one another? Is that exciting?

Beers: It’s definitely exciting. They’ve been around — a lot of that team has different experience than us. They’re experienced technology leaders in this space and I’m definitely looking forward to learning from their wisdom. They have a whole research group that’s dedicated to some of their longer term ideas, so I’m looking forward to learning from them.

You mentioned Einstein Analytics — do Tableau and Einstein conflict? Are they at odds in any way, or do they meld in a good way?

Beers: It’s still early days, but I think you’re going to find that they’re going to meld in a good way.

What else can you tell the Tableau community about what the future holds after the Salesforce acquisition of Tableau?

Beers: We’re going to keep focused on what we’ve been focusing on for a long time. We’re here to bring interesting innovations to market to help people work with their data, and that’s something that’s going to continue. You heard Marc Benioff and [Tableau CEO Adam Selipsky] talk about their excitement around that [during a conference keynote]. Our identity as a product and innovation company doesn’t change, it just gets juiced by this. We’re ready to go — after the conferences are done.


DreamWorks animates its cloud with NetApp Data Fabric

Although it’s only been around since 2001, DreamWorks Animation has several blockbuster movies to its credit, including How to Train Your Dragon, Kung Fu Panda, Madagascar and Shrek. To get the finished product ready for the big screen, digital animators at the Hollywood, Calif., studio share huge data sets across the internal cloud, built around NetApp Data Fabric and its other storage technologies.

An average film project takes several years to complete and involves up to tens of thousands of data sets. At each stage of production, different animation teams access the content to add to or otherwise enhance the digital images, with the cloud providing the connective tissue. The “lather, rinse, repeat” process occurs up to 600 times per frame, said Skottie Miller, a technology fellow at the Los Angeles-area studio.

“We don’t make films — we create data. Technology is our paintbrush. File services and storage is our factory floor,” Miller told an audience of NetApp users recently.

‘Clouds aren’t cheap’

DreamWorks has a mature in-house cloud that has evolved over the years. In addition to NetApp file storage, the DreamWorks cloud incorporates storage kits from Hewlett Packard Enterprise (HPE). The production house runs the Qumulo Core file system on HPE Apollo servers and uses HPE Synergy composable infrastructure for burst compute, networks and storage.

Miller said DreamWorks views its internal cloud as a “lifestyle, a way to imagine infrastructure” that can adapt to rapidly changing workflows.

“Clouds aren’t magic and they’re not cheap. What they are is capable and agile,” Miller said. “One of the things we did was to start acting like a cloud by realizing what the cloud is good at: [being] API-driven and providing agile resources on a self-service basis.”


DreamWorks set up an overarching virtual studio environment that provides production storage, analytics on the storage fabric and automated processes. The studio deploys NetApp All-Flash FAS to serve hot data and NetApp FlexCache for horizontal scale-out across geographies, especially for small files.

The DreamWorks cloud relies on various components of the NetApp Data Fabric. NetApp FAS manages creative workflows. NetApp E-Series block storage is used for postproduction. NetApp HCI storage (based on SolidFire all-flash arrays) serves Kubernetes clusters and a virtual machine environment.

To retire tape backups, DreamWorks added NetApp StorageGRID as back-end object storage with NetApp FabricPool tiering for cold data. The company uses NetApp SnapMirror to get consistent point-in-time snapshots. Along with StorageGRID, Miller said DreamWorks has adopted NetApp Data Availability Services (NDAS) to manage ONTAP file storage across hybrid clouds.

“NDAS has an interesting characteristic. Imagine a cloud thumb drive with a couple hundred terabytes. You can use it to guard against cyberattacks or environmental disasters, or to share data sets with somebody else,” Miller said.

The need for storage analytics

The sheer size of the DreamWorks cloud — a 20 PB environment with more than 10,000 pieces of media — underscored the necessity for deep storage analytics, Miller said.

“We rely on ONTAP automation for our day-to-day provisioning and for quality of service,” he said.

DreamWorks is more than a NetApp customer; the two companies have partnered to further the development of Data Fabric innovations.

A DreamWorks cloud controller helps inform development of NetApp Data Fabric under a co-engineering agreement. The cloud software invokes APIs in NetApp Kubernetes Service.

The vendor and customer have joined forces to build an ONTAP analytics hub that streams telemetry data in real time to pinpoint anomalies and automatically open service tickets. DreamWorks relies on open source tools that it connects to ONTAP using NetApp APIs.


OpenText’s Mark Barrenechea talks Google, AI, future of CMS

Since he took the helm as CEO of OpenText in 2012, Mark Barrenechea has ushered the company through arguably the biggest change in content management technology’s history: on-premises applications migrated to the cloud, which in turn were broken down into content services.

His company, based in Waterloo, Ontario, serves customers all along the digital transformation spectrum. Some still use paper-based workflows and ingest enterprise content into applications hosted on premises. Others are all-cloud, and automate processes with the latest AI and machine learning tech. 

Barrenechea, who took on the added role of CTO in 2016, discussed where content management is going and how AI is changing everything about the technology, workflows and even the definition of content itself.

OpenText started in 1991 and has gone through many chapters in its history. How would you describe the current chapter?

Mark Barrenechea: We started out as a search company, evolved into a content management company in our second phase, and since have evolved into enterprise information management. [EIM] is a very wide, horizontal platform to manage, deliver, secure and exchange unstructured data.

The Google-OpenText partnership is deeper than with other cloud providers, OpenText CEO Mark Barrenechea said, pictured here with Google’s Kevin Ichhpurani discussing the companies’ joint plans during last July’s OpenText Enterprise World keynote.

Over the decades, OpenText acquired many companies. In the last few years, Dell EMC’s Documentum and Guidance Software stand out. How would you describe OpenText’s acquisition philosophy?

Barrenechea: We’re not a ‘public private equity firm.’ We’re a strategic acquirer, building a software platform for information management. Through that lens, we are going to remain an acquirer, remain a consolidator. [An acquisition] has to be a strategic fit in our EIM market. It also has to fill a green space for us, whether it be some functionality, industry or geography that can accelerate our time to market. It also has to meet our financial discipline around value, return on investment capital, payback and integration into the tech stack.

How has AI changed content management and content services over the last few years?


Barrenechea: Maybe this comes with time — we’ll see if it’s wisdom — but I’ve been in software more than 30 years, and I’ve seen a lot of trends come and go. B2B2C tech, dot-com, big data. AI is real because it’s the natural next extension to extreme automation. Once you automate a corporate process, and you automate it for a long time, and, if your data’s really good, you want to go into that data and learn as much as you can to create a better process, company or business model.

AI [can do that], but it’s also going to take time. I’m in a lot of discussions where we tried to go get this insight or outcome, but the process wasn’t quite complete yet. Or the data wasn’t quite right. Those are the battle fronts right now. I’ve seen a lot of progress in getting the process and data right, and now AI and machine learning are producing very actionable insights, whether that’s into talent, all the way through to field service preventive maintenance and cash collections. It’s also intersecting with GDPR and privacy as to what can be shared.

OpenText introduced its own Magellan AI two years ago, but you’re also partnering with Google for AI services. To outsiders, it kind of looks like you’re competing with yourself.

Barrenechea: One of the things we’ve learned is that you have to bring in many sources of data — and keep enriching the data — to get actionable insights. We’ve been bringing new features into Magellan, but we need to look at third-party data sources as well and have a factory to be able to cleanse and merge these different data sources.

But it can’t all be just OpenText data and an OpenText tool to provide insight. So, the partnership with Google provides different tools, different data in different languages, facial recognition going to metadata, speech to metadata, being able to translate and transcribe. For us, it’s about enriching OpenText tools, and Google helps us do that at scale with market-proven technology. It’s quite complementary.

Are you doing the same with AWS and Microsoft Azure partnerships, or is Google a favored partner, and how does it figure into the OpenText roadmap?

Barrenechea: We support all of them, as well as other clouds hosted by Rackspace and global service integrators. A customer can come along and deploy on their own, or we can provide a managed service. But there are features in Google Cloud that we’re going to go deeper in: G Suite, the desktop products, browser, clickstream, transcription and AI. We’re going to support all the clouds, but we’re going deeper with Google in order to add capabilities to the OpenText platform.

What technologies on the OpenText roadmap should customers and prospective customers watch in the near term?

Barrenechea: First, technologies that enable sustainability and responsibility: Enabling ethical supply chains; enabling the circular economy from recycle, reuse, replant; supporting efficacy — track and trace the minerals that make up products. We have a lot of activity in our products to enable this over our business network and content services. It’s a real area to watch.

Second, the volume of content is exploding; handling that with our content services and business network.

In the next three to five years, the center of our world will not be a document. The center of content services will move from a document to an ID. Being able to capture all the metadata and transactions around an ID, whether it’s a person, application or thing, and making everything machine-readable (voice, transcripts, facial recognition, photos, videos, PDF) is front and center of what we’re working on.

Editor’s Note: This Q&A was edited for clarity and brevity.


Microsoft Redmond campus modernization – Construction update – Stories

Since we kicked off demolition in January 2019, there has been great progress on the Redmond campus modernization project. Check out the timelapse below to see some of the work that has been done thus far.

Credit: Skycatch

Here are some other fun facts about the construction efforts:

  • The square footage of the building demolition on east campus is equivalent to the total square feet of all thirty NFL football fields combined.
  • Concrete from the demolition would be enough to build 1.3 Empire State Buildings. One hundred percent of the concrete is being recycled, and some of it will come back to the site for use in the new campus.
  • We’ve recycled a variety of materials from the deconstructed buildings including carpets, ceiling tiles, outdoor lights and turf from the sports fields. As a result, we have diverted almost 95 percent of our demolition waste away from landfills.
  • The resources recycled from the demolition thus far includes 449,697 pounds (50 trucks full) of carpet and 284,400 pounds of ceiling tiles.
  • The majority of the furniture removed from the demolished buildings that will not be reused in other buildings has been donated to local charities and nonprofit startups.
  • We’ve moved 1 million cubic yards of dirt and reached the bottom of the digging area for our underground parking facility, which will consolidate traffic and make our campus even more pedestrian and bike friendly.
  • We’ve installed 51,000 feet of fiber optic cabling. That’s just over 9.5 miles.
  • The Microsoft Art Program has relocated 277 art pieces, including an early Chihuly and a Ken Bortolazzo sculpture. These art pieces were placed across our Puget Sound buildings so they can continue to be enjoyed by employees and guests.
  • The drone video featured above, created by Skycatch, not only offers a unique view of the project, but the images have fed into 3D models of the site which are providing data to more effectively tackle challenges as they arise, plan ahead and monitor construction progress.
  • The project is actively coordinating over 100 different building information models containing over 2.8 million individual 3D building components.

We look forward to continuing this journey to modernize Microsoft’s workplaces. When completed, the project will provide additional proximity for teams who collaborate and an inspiring, healthy and sustainably responsible workplace where our employees can do their best work and grow their careers.

Continued thanks for your patience and flexibility during the construction phase. As a reminder, please allow extra time to get around campus and remind visitors to do the same. Always be cautious around the construction sites and remain mindful of safety notices and instructions.

Follow updates and developments as this project progresses and view the latest renderings on Microsoft’s Modern Campus site.

Go to Original Article
Author: Microsoft News Center

Oracle applications development EVP on Fusion, SaaS and what’s ahead

SAN FRANCISCO — Oracle executive vice president Steve Miranda has worked at the company since 1992 and leads all application development at the vendor. He was there well before Oracle made its acquisition-driven push against application rival SAP in the mid-2000s, with the purchases of PeopleSoft and Siebel.

In 2007, Oracle put Miranda in charge of Fusion Applications, the next-generation software suite that took a superset of earlier application functionality, added a modern user experience and embedded analytics, and offered both on-premises and cloud deployments. Fusion Applications became generally available in 2011, and since then Oracle has continued to flesh out its portfolio with acquisitions and in-house development.

Of the three main flavors of cloud computing, SaaS has been by far the most successful for Oracle applications, as it draws in previously on-premises workloads and attracts new customers. The competition remains fierce, with Oracle jockeying not only with longtime rival SAP but also the likes of Salesforce and Workday.

Miranda spoke to TechTarget at Oracle’s OpenWorld conference in a conversation that covered Fusion’s legacy, the success of SaaS deployments compared with on-premises ones, Oracle’s app acquisitions of late and the road ahead.

Software project cost overruns and outright failures have been an unfortunate staple of the on-premises world. The same hasn’t happened in SaaS. Part of this is because SaaS is vendor-managed from the start, but issues like change management and training are still risk factors in the cloud. Explain what’s happening from your perspective.

We have a reasonably good track record, even in the on-premises days. The noticeable difference I’ve seen [with cloud] is as follows:

In on-premises deployments, because you had a set version, and because you knew you weren’t going to move off it for years, you started the implementation, but you had to have everything, because there wasn’t another version coming [soon].

Now, inevitably, that meant it took a while. And then what that meant is your business sometimes changed. New requirements came in. That meant you had to change configuration, or buy a third-party [product], or customize. That meant the implementation pushed out. But [initially], you had this sort of one-time cliff, where you had to go or no-go, because you weren’t going to touch the system forever more; that was sort of the way it was. Or maybe you’d look at it years later. It put a tremendous amount of pressure [on customers].

Steve Miranda, executive vice president of Oracle applications development, addresses attendees at Oracle OpenWorld last week.

So what happened was, while companies tried to control scope, because there wasn’t a second phase, or the second phase was way out, it was really hard to control scope.

In SaaS, the biggest shift that I’ve seen from customers is that the mentality is all different, given that they know, by the nature of the product we’ve built, they’re going to get regular updates. Their mindset is, “OK, we’re going to take advantage of new features. We’re going to more continually change our development process or our business process.”

Do last-minute things pop up? Sure. Do project difficulties pop up? Sure. But [you need] the willingness to say, “You know what? We’re going to keep phase one, the date’s not moving, which means your cost doesn’t move.”

In SaaS, projects aren’t perfect, sometimes there’s [a scope issue], but you have something live. You get some payback, and there’s some kind of finish line for that. That’s the biggest difference that I’ve seen.

The Fusion Applications portfolio and brand persists today and was a big focus at OpenWorld. But Fusion was announced in 2005, and became GA in 2011. That’s eight years ago. So in total, Fusion’s application architectural pattern is about 15 years old. How old is too old?

Are they old compared to on-premise products? Definitely not. Are they old compared to our largest SaaS competitor [Editor’s note: Salesforce]? No, that’s actually an older product.

Okay, now, just in a standalone way, is Fusion old? Well, I would say a lot of the technology is not old. We are updating to OCI, the latest infrastructure, we’ve moved our customers there. We are updating to the latest version of the Oracle database to an Autonomous Database. We’ve refreshed our UI once already, and in this conference, we announced the upcoming UI.

Now. If you go through every layer of the stack, and how it’s architected and how it’s built, you know, there’s some technical debt. It depends on what you mean by old.

We’re moving to more of a microservices architecture; we build that part a piece at a time. Once we get done with that, there’s going to be something else behind it. [Oracle CTO and chairman Larry Ellison] talked about serverless and elasticity of the cloud. We’re modifying the apps architecture to more fully leverage that.

So if the question is, in hindsight, did we make mistakes? The biggest mistake for me personally is, look: We had a very large customer installed base across PeopleSoft, Siebel, E-Business Suite and JD Edwards, and the expectation from our customers is that when Oracle says we’ve got something, they can move to it, and they can move to the cloud.

And so what we tried to do with Fusion V1, and one of the reasons it took us longer than anticipated is that we had this scope.

Any company now, it’s sort of cliche, they have this concept of minimum viable product. You introduce a product, and does it service all of the Oracle customer base? No. Will it serve a certain customer base? Sure, yeah. And then you get those customers and you add to it, you get more customers, you add to it, you improve it.

We had this vision of, let’s get a bigger and bigger scope. Had I done it over again? We’d have built a minimum viable product, we would have announced it to a subset of our customers, and then some of this noise that you hear, like “Oracle took too long” or “Oracle’s late to market” in some areas, wouldn’t have been there.

I would argue in a lot of the areas, while it may have taken us longer to come to market, we came out with a lot more capabilities than our competitors right out the box, because we had a different mindset.

Oracle initially stressed how Fusion Applications could be run both on-premises and as SaaS, in part to ease customer concerns over the longer-term direction for Oracle applications. But most initial Fusion customers went with SaaS because running it on-premises was too complicated. Why did things play out that way?


I would take issue with the following: Let’s say we had the on-prem install, like, perfect. One button press, everything’s there. Do I think that we would have had a lot of uptake of Fusion on-premises as opposed to SaaS? No. I think the SaaS model is better.

Did we optimize the on-premises install? No. We didn’t intentionally make it complicated. But, you know, we were focused on the SaaS market. We were [handling] the complexity. Was it perfect? No. Did that affect some customers? Yes. Did it affect the overall market? No, because I think SaaS was going to [win] anyway.

The classic debate for application customers and vendors is best-of-breed versus suites. Each approach has its own advantages and tradeoffs. Is this the status quo today, in SaaS? Has a third way emerged?

I don’t know if it’s a third way. We believe we have best-of-breed in many, many areas. Secondly, we believe in an integrated solution. Now let’s take that again. I view the customer as having three constituents they care about. They care about their own customers, they care about their employees and they care about their stakeholders: for a public company, that’s shareholders; if it’s a private company, it’s different.

If you told me for any given company, there are two or five best-of-breed applications out for some niche feature that benefits one of those three audiences? OK. You go with it, no problem.

If you told me there were 20 or 50 best-of-breed options for a niche feature? It’s almost impossible for there to be that many niche features that matter to those three important audiences, particularly in the areas we really specialize in: ERP, supply chain, finance and HR, to a lesser extent CRM, and slightly less in some parts of HR.

So this notion of “Oh, let’s best-of-breed everything.” Good luck. I mean, you could do it. But I don’t think you’re going to be happy because of the number of integrations. I don’t believe in that at all.

Let’s move forward to today. Apart from NetSuite in 2016, there haven’t been any mega-acquisitions in Oracle applications lately. Rather, it’s been around companies that play in the CX space, particularly ones focused on data collection and curation. What’s the thinking here?

Without data, you can automate a map, right? You can find out how to go from here to Palo Alto, no problem; you have it on your phone, you can get directions, etc. But when you add data and you turn on Waze, it gives you a different route, because you have real-time data, traffic alerts and road closures. It’s a lot more powerful.

And so we think real-time data matters, especially in CRM but also, frankly, in ERP. You might have a supplier whose status changes; they go through an M&A, or other things. You want to have an ERP and CRM system that doesn’t ignore the outside world. Data is actually much more freely available today, and you want to have a system that assumes that. So that’s our investment.

Oracle has recently drawn closer to Microsoft, forming a partnership around interoperability between Azure and Oracle Cloud Infrastructure. Microsoft is placing a big bet on Graph data connect, which pulls together information from its productivity software and customers’ internal business data. It seems like a place where your partnership could expand for mutual benefit.

I’m not going to answer that. I can’t comment on that. It’s a great question.

Go to Original Article
Author:

For Sale – Late 2015 27” iMac

This hasn’t been touched since my twins were born in January and it seems a shame for it to just sit there.

It’s the top-of-the-range model from late 2015, and I have added another 8GB of RAM for a total of 16GB (I thought it was Crucial or Kingston, as I only ever buy RAM from them, but I’ve since remembered it’s original Apple RAM, bought from someone upgrading to 32GB).

3.3GHz i5
2TB Fusion Drive (includes the 128GB SSD)
16GB RAM
Radeon R9 M395 2GB graphics
5K 27” screen

It has spent its life sat on a desk in my man-cave, which is now a baby-junk cave, so is in excellent condition. Box is in the attic.

Will have a fresh install of Mojave for the new owner.

Asking £850 picked up from SO51 area

Price and currency: £850
Delivery: Goods must be exchanged in person
Payment method: Cash or BT
Location: Romsey, Hampshire
Advertised elsewhere?: Not advertised elsewhere
Prefer goods collected?: I prefer the goods to be collected

______________________________________________________
This message is automatically inserted in all classifieds forum threads.
By replying to this thread you agree to abide by the trading rules detailed here.
Please be advised, all buyers and sellers should satisfy themselves that the other party is genuine by providing the following via private conversation to each other after negotiations are complete and prior to dispatching goods and making payment:

  • Landline telephone number. Make a call to check out the area code and number are correct, too
  • Name and address including postcode
  • Valid e-mail address

DO NOT proceed with a deal until you are completely satisfied with all details being correct. It’s in your best interest to check out these details yourself.


Azure and VMware innovation and momentum

Since announcing Azure VMware Solutions at Dell Technologies World this spring, we’ve been energized by the positive feedback we’ve received from our partners and customers who are beginning to move their VMware workloads to Azure. One of these customers is Lucky Brand, a leading retailer that is embracing digital transformation while staying true to its rich heritage. As part of their broader strategy to leverage the innovation possible in the cloud, Lucky Brand is transitioning several VMware workloads to Azure.

“We’re seeing great initial ROI with Azure VMware Solutions. We chose Microsoft Azure as our strategic cloud platform and decided to dramatically reduce our AWS footprint and third-party colocated data centers. We have a significant VMware environment footprint for many of our on-premises business applications.

The strategy has allowed us to become more data-driven and gives our merchant and finance analysts the ability to uncover results quickly, with all the data in a central cloud platform, which provides great benefits for us in the competitive retail landscape. Using Microsoft Azure and VMware, we leverage a scalable cloud architecture, with VMware virtualizing and managing the computing resources and applications in Azure in a dynamic business environment.

Since May, we’ve been successfully running these applications on the Azure VMware Solution by CloudSimple platform. We are impressed with the performance, ease of use, and the level of support we have received from Microsoft and its partners.”

Kevin Nehring, CTO, Lucky Brand

Expanding to more regions worldwide and adding new capabilities

Based on customer demand, we are excited to announce that we will expand Azure VMware Solutions to a total of eight regions across the US, Western Europe, and Asia Pacific by the end of the year.

In addition to expanding to more regions, we are continuing to add new capabilities to Azure VMware Solutions and deliver seamless integration with native Azure services. One example is how we’re expanding the supported Azure VMware Solutions storage options to include Azure NetApp Files by the end of the year. This new capability will allow IT organizations to more easily run storage intensive workloads on Azure VMware Solutions. We are committed to continuously innovating and delivering capabilities based on customer feedback.

Broadening the ecosystem

It is amazing to see the market interest in Azure VMware Solutions and the partner ecosystem building tools and capabilities that support Azure VMware Solutions customer scenarios.

RiverMeadow now offers capabilities to accelerate the migration of VMware environments to Azure VMware Solutions.

“I am thrilled about our ongoing collaboration with Microsoft. Azure VMware Solutions enables enterprise customers to get the benefit of cloud while still running their infrastructure and applications in a familiar, tried and trusted VMware environment. Combined with the performance and cost benefits of VMware on Azure, you have a complete solution. I fully expect to see substantial enterprise adoption over the short term as we work with Microsoft’s customers to help them migrate even the most complex workloads to Azure.”

Jim Jordan, President and CEO, RiverMeadow

Zerto has integrated its IT Resilience Platform with Azure VMware Solutions, delivering replication and failover capabilities between Azure VMware Solution by CloudSimple, Azure and any other Hyper-V or VMware environments, keeping the same on-premises environment configurations, and reducing the impact of disasters, logical corruptions, and ransomware infections.

“Azure VMware Solution by CloudSimple brings the familiarity and simplicity of VMware into the Azure public cloud. Every customer and IT pro using VMware will be instantly productive with minimal or no Azure competency. With Zerto, VMware customers gain immediate access to simple point-and-click disaster recovery and migration capabilities between Azure VMware Solutions, the rest of Azure, and on-premises VMware private clouds. Zerto, one of Microsoft’s top ISVs and an award-winning industry leader in VMware-based disaster recovery and cloud migration, delivers native support for Azure VMware Solutions.”

Peter Kerr, Vice President of Global Alliances, Zerto

Veeam Backup & Replication™ software specializes in supporting VMware vSphere environments; Veeam’s solutions will help customers meet the backup demands of organizations deploying Azure VMware Solutions.

“As a leading innovator of Cloud Data Management solutions, Veeam makes it easy for our customers to protect their virtual, physical, and cloud-based workloads regardless of where those reside. Veeam’s support for Microsoft Azure VMware Solutions by CloudSimple further enhances that position by enabling interoperability and portability across multi-cloud environments. With Veeam Backup & Replication, customers can easily migrate and protect their VMware workloads in Azure as part of a cloud-first initiative, create an Azure-based DR strategy, or simply create new Azure IaaS instances – all with the same proven Veeam solutions they already use today.”  

Ken Ringdahl, Vice President of Global Alliances Architecture, Veeam Software

Join us at VMworld

If you plan to attend VMworld this week in San Francisco, stop by our booth and witness Azure VMware Solutions in action; or sit down for a few minutes and listen to one of our mini theater presentations addressing a variety of topics such as Windows Virtual Desktop, Windows Server, and SQL Server on Azure in addition to Azure VMware Solutions!

Learn more about Azure VMware Solutions.

Go to Original Article
Author: Microsoft News Center

For Sale – Macbook Air (13-inch, Mid 2012) – 1.8ghz i5, 8GB Ram, 256GB SSD – £150

I bought this laptop direct from Apple in 2012, but since the start of the year I haven’t been using it so would like to move it on.

This is a well-used laptop! I will do my best to describe any issues, but please understand that it is not in pristine condition.

I have recently wiped the machine and installed the latest version of OSX (Mojave).

Here are all of the things to bear in mind with this laptop.

1) The condition of the casing: there are various dings and knocks on the body of the laptop. I have tried to take pictures of them all; however, there will undoubtedly be more that I haven’t captured. These are all cosmetic issues.
2) The charger has seen better days! It was eaten by the Roomba a couple of times and the plastic on the cable is coming away. It still works, though.
3) Some of the anti-reflective coating, or whatever it is, has come away on the screen. Looks terrible when it’s off; not as noticeable when it’s on.
4) There are a couple of bright spots in the bottom left of the screen.
5) The touchpad is a little funny. It clicks fine in the bottom left, but you have to put a lot of pressure on it to click anywhere else. Taps work all over the touchpad.

All of that said, it works fine; if you connected it up to a screen with a mouse and keyboard you’d be fine.

I still have the box and it will be sent in the box. Happy to answer any questions etc.

My preference is to sell as collection only (London SM6 or WC2B) however would consider delivery.


Price and currency: £150
Delivery: Delivery cost is not included
Payment method: Bank Transfer or PayPal
Location: London SM6 or WC2B
Advertised elsewhere?: Not advertised elsewhere
Prefer goods collected?: I prefer the goods to be collected

