Tag Archives: interesting

For Trade – (or Sale) Macbook Pro 2017 – 13″, i5 3.1GHz, Touch Bar, Warranty to Sept 2020

So this will be an interesting thread…

I’m looking to part-exchange my existing MacBook Pro for one of lower value. It’s overkill for my needs (it’s that age-old thing: you buy something way higher spec than you need!)

Specs of MacBook are as follows:

Macbook Pro 13″ 2017 TouchBar Version
Core i5 @ 3.1GHz
250GB SSD
8GB RAM
Intel Iris Graphics, 1536MB
AppleCare Coverage to September 2020
Battery cycle count is 4 (it pretty much lives on the mains)
Condition is excellent; the only reason I’m not classing it as mint is because it’s not brand new

Comes boxed with the power adapter. While my preference would be to deal in person, I’m open to postage for the right offer. As you can imagine, insurance on the delivery will be at a premium; the ballpark price is about £40 for delivery alone.

Ideally, I’d like to come out of this with a minimum of £200 cash from any prospective buyer depending on the item they are trading. Otherwise, I’m happy to sell for £1100. Even on the refurb store, this MacBook goes for £1489 before you purchase additional AppleCare.

Price and currency: £1150
Delivery: Delivery cost is included within my country
Payment method: BT/PPG
Location: Basingstoke
Advertised elsewhere?: Not advertised elsewhere
Prefer goods collected?: I prefer the goods to be collected

______________________________________________________
This message is automatically inserted in all classifieds forum threads.
By replying to this thread you agree to abide by the trading rules detailed here.
Please be advised, all buyers and sellers should satisfy themselves that the other party is genuine by providing the following via private conversation to each other after negotiations are complete and prior to dispatching goods and making payment:

  • Landline telephone number. Make a call to check that the area code and number are correct, too
  • Name and address including postcode
  • Valid e-mail address

DO NOT proceed with a deal until you are completely satisfied with all details being correct. It’s in your best interest to check out these details yourself.


TLBleed attack can extract signing keys, but exploit is difficult

An interesting new side-channel attack abuses the Hyper-Threading feature of Intel chips and can extract signing keys with near-perfect accuracy. But both the researchers and Intel downplayed the danger of the exploit.

Ben Gras, Kaveh Razavi, Herbert Bos and Cristiano Giuffrida, researchers at Vrije Universiteit’s systems and network security group in Amsterdam, said their attack, called TLBleed, takes advantage of the translation lookaside buffer cache of Intel chips. If exploited, TLBleed can allow an attacker to extract the secret 256-bit key used to sign programs, with a success rate of 99.8% on Intel Skylake and Coffee Lake processors and 98.2% accuracy on Broadwell Xeon chips.

However, Gras tweeted that users shouldn’t be too scared of TLBleed, because while it is “a cool attack, TLBleed is not the new Spectre.”

“The OpenBSD [Hyper-Threading] disable has generated interest in TLBleed,” Gras wrote on Twitter. “TLBleed is a new side-channel in that it shows that (a) cache side-channel protection isn’t enough: TLB still leaks information; (b) side-channel safe code that is constant only in the control flow and time but not data flow is unsafe; (c) coarse-grained access patterns leak more than was previously thought.”
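Point (b) above is subtle: code can be branch-free and constant-time and still leak through *which memory it touches*. The following hypothetical C sketch (not the researchers' code; names and the one-entry-per-page layout are illustrative) contrasts a secret-indexed table lookup, whose TLB footprint depends on the secret, with a version that touches every page and selects the wanted byte arithmetically:

```c
#include <stdint.h>

#define PAGE 4096

/* Each table entry sits on its own page, so WHICH entry is read
 * determines WHICH TLB entry gets filled. */
static uint8_t table[16][PAGE];

/* Constant control flow and constant time, yet the page touched depends
 * on the secret index -- the class of data-flow leak TLBleed measures. */
uint8_t leaky_lookup(int secret_nibble) {
    return table[secret_nibble][0];
}

/* Touch every page and mask out all but the wanted entry, so the TLB
 * footprint is identical for every secret value. */
uint8_t constant_access_lookup(int secret_nibble) {
    uint8_t result = 0;
    for (int i = 0; i < 16; i++) {
        uint8_t mask = (uint8_t)-(i == secret_nibble); /* 0xFF or 0x00 */
        result |= table[i][0] & mask;
    }
    return result;
}
```

Both functions return the same byte; only the second keeps the data flow, not just the control flow, independent of the secret.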

Justin Jett, director of audit and compliance for Plixer LLC, a network traffic analysis company based in Kennebunk, Maine, said TLBleed is “fairly dangerous, given that the flaw allows for applications to gain access to sensitive memory information from other applications.” But he noted that exploiting the issue would prove challenging.

“The execution is fairly difficult, because a malicious actor would need to infect a machine that has an application installed that they want to exploit. Once the machine is infected, the malware would need to know when the application was executing code to be able to know which memory block the sensitive information is being stored in. Only then will the malware be able to attempt to retrieve the data,” Jett wrote via email. “This is particularly concerning for applications that generate encryption keys, because the level of security that the application is trying to create could effectively be reduced to zero if an attacker is able to decipher the private key.”

Intel also downplayed the dangers associated with TLBleed; the company has not assigned a CVE number and will not patch it.

“TLBleed uses the translation lookaside buffer, a cache common to many high-performance microprocessors that stores recent address translations from virtual memory to physical memory. Software or software libraries such as Intel Integrated Performance Primitives Cryptography version U3.1 — written to ensure constant execution time and data independent cache traces should be immune to TLBleed,” Intel wrote in a statement via email. “Protecting our customers’ data and ensuring the security of our products is a top priority for Intel, and we will continue to work with customers, partners and researchers to understand and mitigate any vulnerabilities that are identified.”

Jett noted that even if Intel isn’t planning a patch, it should do more to alert customers to the dangers of TLBleed.

“Intel’s decision to not release a CVE number is odd at best. While Intel doesn’t plan to patch the vulnerability, a CVE number should have been requested so that organizations could be updated on the vulnerability and software developers would know to write their software in a way that may avoid exploitation,” Jett wrote. “Without a CVE number, many organizations will remain unaware of the flaw.”

The researchers plan to release the full paper this week. And, in August, Gras will present on the topic at Black Hat 2018 in Las Vegas.

An Introduction to the Microsoft Hybrid Cloud Concept and Azure Stack

In recent years, so-called “cloud services” have become more and more interesting, and some customers are already thinking of going 100% cloud. There are a lot of competing cloud products out there, but is there a universal description of a cloud service? That is what I will address here.

Let’s start with the basics. Since time began (by that I mean “IT history”), we have all been running our own servers in our own datacenters with our own IT employees. As a result, each company ended up with a large number of individually configured servers, and its IT staff had to manage all of them. This put a heavy and increasing load on IT administrators: no time for new services, and often not even time to update existing ones to mitigate the risk of being hacked. In parallel, development teams and management expected IT to behave in an agile fashion, which was impossible under those conditions.

Defining Cloud Services

This is not a sustainable model, and that is where the cloud comes in. A cloud service is a highly optimized, standardized service that works out of the box, without small individual changes to its configuration. Cloud services let you simply consume a service (much like drawing power from a wall socket) with a predefined and guaranteed SLA (service level agreement). If the SLA is broken, you as the customer even get money back. The catch is that these services need to run in highly standardized setups, in highly standardized datacenters that are geo-redundant around the world. When it comes to Azure, these datacenters are run in so-called “regions”, with a minimum of three datacenters per region.

In addition to this, Microsoft runs its own backbone (separate from the public internet) to provide a high quality of service: available bandwidth that meets Quality of Service (QoS) requirements.

To put it in one sentence: a cloud service is a highly standardized IT service with guaranteed SLAs, running in public datacenters, available from everywhere around the world at high quality. From a financial point of view, you generally pay per user, per service or per some other flexible unit, and you can increase or decrease your usage based on your current needs.

Cloud Services – your options

If you want to invest in cloud services, you will have to choose between:

  • A private Cloud
  • A public Cloud
  • A hybrid Cloud

A private cloud consists of IT services provided by your internal IT team, but delivered in the same manner you could get them as an external service. It runs in your own datacenter and hosts services only for your company or company group. This means you have to provide the required SLA yourself.

A public cloud describes IT services provided by a hosting service provider with a guaranteed SLA. The services run in public datacenters and are not spun up individually just for you.

A hybrid cloud is a mixture of a public and a private cloud, or in other words, “an internet-connected private cloud whose services are consumed like public cloud services”. Hybrid cloud deployments can be especially useful if there is a reason not to move a service to a public cloud, such as:

  • Intellectual property needs to be stored on company-owned, dedicated systems
  • Highly sensitive data (e.g. health care) is not allowed to be stored on public services
  • Poor connectivity in your region could cut you off from public cloud services

Responsibility for Cloud Services

If you decide to go with public cloud services, the question is always how many of your network services are you willing to move to the public cloud?

The general answer is: the more services you can transfer to the cloud, the better. However, even the best-laid plans can be at the mercy of your internet connectivity, which can cut you off from these services if not planned for. Additionally, industry regulations have made a 100% cloud footprint difficult for some organizations. The hybrid solution is then the most practical option for the majority of business applications.

Hybrid Cloud Scenarios

These reasons drove Microsoft’s decision to provide Azure for your own datacenter, as a packaged solution based on the same technology as Azure itself. Azure is built around the concept of REST endpoints and ARM templates (JSON files with declarative definitions for services). Additionally, Microsoft decided that this on-premises Azure solution should not provide only IaaS; it should be able to run PaaS, too, just like the public Azure cloud.
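To make the ARM template concept concrete, here is a minimal sketch of one that declares a single storage account (the parameter name is illustrative; the resource type and schema URL follow the Azure Resource Manager template format):

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "storageAccountName": { "type": "string" }
  },
  "resources": [
    {
      "type": "Microsoft.Storage/storageAccounts",
      "name": "[parameters('storageAccountName')]",
      "apiVersion": "2016-01-01",
      "location": "[resourceGroup().location]",
      "sku": { "name": "Standard_LRS" },
      "kind": "Storage",
      "properties": {}
    }
  ]
}
```

Because the template only *declares* the desired state, the same file can be deployed against public Azure or, in principle, against an Azure Stack endpoint that offers the same resource provider.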

This basically means that for a service to become available in this new on-premises “Azure Stack”, it must already be generally available (GA) in public Azure.

This solution is called “Azure Stack” and comes on certified hardware only. This ensures that you as the customer get the performance, reliability and scalability you expect from Azure with Azure Stack, too.

As of today, the following hardware OEMs are part of this initiative:

  • DELL
  • HPE
  • Lenovo
  • Cisco
  • Huawei
  • Intel/Wortmann
  • Fujitsu

A growing set of services is available with Azure Stack today, and as it is an agile product from Microsoft, we can expect many interesting updates in the future.

With Azure Stack, Microsoft provides a simple way to spread services between on-premises and the public cloud. Possible scenarios include:

  • Disconnected scenarios (Azure Stack in planes or ships)
  • Azure Stack as your development environment for Azure
  • Low latency computing
  • Hosting Platform for MSPs
  • And many more

As we all know, IT is hybrid today in most industries all over the world. With the combination of Azure Stack and Azure, you have the chance to fulfill these requirements and set up a unified cloud model for all of your company’s services.

Summary

As you have seen, Azure Stack brings public Azure to your datacenter with the same administration and configuration models you already know from public Azure. There is no need to learn everything twice: training costs go down, and the standardization gives you more flexibility and puts less load on local IT admins, freeing them to work on new, higher-quality solutions. Cloud-style licensing also makes things less complex, as costs are simply based on a usage model. You can even link your Azure Stack licenses directly to an Azure subscription.

As hybrid cloud services are the future for the next 10 years or more, Azure and Azure Stack together can make your IT more successful than it has ever been.

If you want to learn more about Azure Stack, watch our webinar: Future-proofing your Datacenter with Microsoft Azure Stack.

How about you? Does your organization have interest in Azure Stack? Why or why not? We here on the Altaro Blog are interested! Let us know in the comments section below!

Thanks for reading!

What’s new with data protection systems? Everything

How did buying backup hardware get interesting again in this age of software-defined storage and the cloud?

Actually, it’s the software and the cloud that made data protection systems more interesting than they have been in years. The hardware hasn’t changed much since the early 2000s, when disk supplanted tape as the mainstream backup medium. Even the rise of flash in primary storage hasn’t changed the hard disk’s role in data protection systems.

But the game has changed for buying data protection systems, including the role of the hardware. Led by newcomers Cohesity and Rubrik, vendors have expanded the targeted uses: backup is now just one piece of secondary storage management, and the target includes the cloud as well as an on-premises disk appliance or library.

There’s a good chance your next backup hardware will come with backup software integrated instead of sold separately. That alone changes the buying dynamic. It may simplify the buying and integration processes, but it complicates the decision-making.

How we got here

Rubrik ($292 million in funding) and Cohesity ($160 million) have been the darlings of investors, and each claims it is raking in over $100 million in annual revenue after less than three years in the market. The storage establishment has also taken notice.

Following Cohesity and Rubrik‘s lead, backup vendors Dell EMC and Commvault have gone the integration route with their data protection systems. Backup software leader Veritas Technologies was already selling integrated systems, but is expanding beyond pure backup into data management. Where does that leave software-only vendors such as Veeam Software, which partners with backup hardware vendors but doesn’t sell an integrated appliance to compete with them? That was also Commvault’s situation until it began shipping converged appliances in late 2017.


The last time backup hardware buying went through this much change was during the early days of data deduplication around 2005. Startups such as Data Domain and Avamar (both now a part of Dell EMC) proved the value of shrinking the amount of data that had to be backed up, and the rest of the industry rushed to follow. That change helped make disk the dominant platform for backup.

Now, vendors are scrambling to replicate the integrated concept Rubrik and Cohesity began. Both vendors have roots in hyper-convergence pioneer Nutanix, and they call their technology hyper-converged secondary storage. Other vendors are picking up that term and expect the secondary market to undergo changes similar to what hyper-convergence put primary storage through.

Today’s backup choices

Backup targets are no longer always mere repositories that serve as insurance policies for backed-up data. They can often host platforms that combine backup, archiving, disaster recovery and copy data management. Besides those features, they must connect to the cloud and often manage data across clouds. They replace the need for separate backup software, because they work directly with applications, servers and hypervisors. They can scale up by adding nodes instead of forcing customers to buy a complete new library.

But you may not need that. A large enterprise might want the traditional disk backup library, especially if it already has a lot of critical data on it. And an organization that’s learned to trust its backup software might not want to trash it to use a Cohesity or Rubrik box with their software included. In other words, if you like your backup software, you should keep it. That might mean sticking with a traditional Dell EMC Data Domain-type data protection system as well.

But there are fundamental differences in how vendors integrate software and the cloud on backup targets. That means comparison shopping isn’t always apples to apples.

Choices of data protection systems include complete platforms like Cohesity, Commvault and Rubrik or systems that integrate the vendor’s backup software and don’t need separate media servers like Arcserve, Dell EMC Integrated Data Protection Appliances, Unitrends or Veritas. When buying backup hardware, you must consider factors such as the size and capacity of the system; the way you manage, secure and upgrade it; and how it scales.

With secondary storage use cases blending and factors such as the cloud as a tier, the need to protect software-as-a-service apps through cloud-to-cloud backup and compliance with the European Union’s General Data Protection Regulation becoming a reality, your next backup target will likely be significantly different from your last one.

From bitcoin to deprecated Java, a look at 2017’s top opinions

It’s always interesting looking back on the various topics that arose throughout the previous year in the world of Java and technology. TheServerSide typically likes to stick with fairly neutral feature articles, news pieces, tips and tutorials, but every once in a while a subject or an issue arises that calls for a bit of an opinion to be offered. Here’s a look at ten of the most popular opinion pieces for 2017 — based on comments — ranging from cleaning up deprecated Java methods in the Java SE API to the problems with the bitcoin blockchain.

10. The ignominy of working at Uber

It seems that the various issues surrounding Uber and other ride-sharing programs never abated throughout 2017. Of specific interest to developers were the unethical practices of creating applications such as Greyball, which geofenced law enforcement agencies, not to mention the allegations by former employee Keala Lusk regarding sexual harassment in the workplace. It all raises the question: if you had been a developer at Uber, would you include that fact on your professional resume? I certainly wouldn’t.

9. Diversity versus parity

Former Google employee James Damore raised a number of eyebrows with his blog post “Google’s Ideological Echo Chamber,” which criticized the search engine’s approach to diversity and inclusion. Of course, what exactly do people mean when they talk about diversity and inclusion? Is the goal really diversity or is the actual goal simply gender and ethnic parity? There’s a big difference between the two concepts.

8. Process and methods

It’s not wise to speak unkindly about Agile and DevOps, the two hottest methodologies and software development processes around today. But these tall poppies were growing a little high, and someone just needed to address the fact that, as effective as these two approaches are, they aren’t magical unicorns, and they won’t fix every problem a software development team will encounter.

7. Deprecated Java code

Sometimes a discussion on a boring topic such as deprecated Java methods sings to the hearts of the software developers. A snide little rant about using deprecated Java methods garnered far more social media shares than one would have expected.

6. Cloud-native Java

Perhaps the biggest trend in the world of Java development was the move to cloud-native Java architectures. But sometimes the push to re-architect the entire data center was a little over enthusiastic and organizations with more than sufficient infrastructures were made to feel there was a problem with systems that are working just fine. Far too often, fear mongering was driving the microservices and containers trends.

5. The 12-Factor app

How do you develop cloud-native Java applications? Some suggest the mantra of the 12-Factor app is the right place to start. I’d disagree, as I noted in this article.

4. Java EE APIs

I like JavaServer Faces and I like JSR-371, the new Model-View-Controller (MVC 1.0) framework for developing user interfaces. But I’ve never liked the fact that UI frameworks get included in the Java EE specification. They shouldn’t be. They should stand alone.

3. Cloud failures

The Amazon outage was big news in February. But the story was bigger than the simple fact that an AWS availability zone temporarily went down. It was about how that Amazon outage impacted the faith and trust clients have in their cloud computing vendors.

2. Explaining away the Amazon outage

Amazon explained away its zone outage as simple data input error. But when an entire availability zone goes down, there’s more going on than simply a user hitting the wrong key on the keyboard.

1. Bitcoin and blockchains

2017 will be remembered as the year bitcoin broke the $10,000 barrier. But just how fundamentally sound is the blockchain technology that underlies the cryptocurrency? This article on the issues surrounding bitcoin and blockchain technology not only attracted a large number of eyeballs, but it generated an equally large number of comments and shares on various social media sites.

2017 provided plenty of fodder for opinionated takes on both current and ongoing events that pervaded the software development industry. There is no doubt that 2018 will be equally interesting, although it’ll be hard to top a year that included deprecated Java methods and DevOps unicorns prancing around.

Cabling across the Atlantic, designing cars with Microsoft HoloLens and pre-ordering Xbox One X – Weekend Reading: Sept. 22 edition – The Official Microsoft Blog

Microsoft was full of interesting news this week, from an innovative, transatlantic project to a Microsoft HoloLens partnership that powers creativity to Xbox and “Minecraft” announcements that make games more fun. Here’s a look.

A worker adjusts the Marea cable on a beach in Spain. Image by RUN Studios.

Microsoft, Facebook and telecommunications infrastructure company Telxius have completed the highest-capacity subsea cable to cross the Atlantic. Called Marea (Spanish for “tide”), the 4,000-mile cable connects from Virginia to Spain, provides up to 160 terabits per second and will make transatlantic connections more resilient.

Dependable infrastructure is crucial for internet and cloud services, but it was the 2012 devastation of Hurricane Sandy, which shut down connectivity services on the East Coast for days, that drove home the need for Marea. The cable is expected to become operational in early 2018.

“Everyone expects that whenever they turn on their computer or their tablet or their phone, they’re going to work,” says Frank Rey, director of global network strategy for Microsoft’s Cloud Infrastructure and Operations division. “That’s what this cable is going to help enable.”

Ford announced that it’s expanding its use of Microsoft HoloLens in designing vehicles, after successfully piloting the technology to improve creativity, collaboration and time to market.

The company’s design process has traditionally involved expensive, time-consuming clay models. But HoloLens enables Ford designers to blend 3D holograms with the models and physical production vehicles for easier creativity and faster iterations.

In gaming news, pre-orders for the new Xbox One X console began this week at local retailers around the world, including Microsoft Store and Microsoft.com. More than 130 games, including “Far Cry 5” and “L.A. Noire,” will be enhanced for Xbox One X with higher resolutions and faster framerates to take advantage of the powerful console.

The week’s other big gaming news was the arrival of Better Together, “Minecraft’s” biggest update ever. The update unites players on console, mobile, VR and Windows 10 versions into one Bedrock family. That means it’s time to say goodbye to “Minecraft: Xbox One Edition” and “Minecraft: Windows 10 Edition” and hello to a universal “Minecraft.”

When it comes to security, staying ahead of threats is critical. That’s why Windows Defender Advanced Threat Protection (ATP) is integrating automated investigation and remediation capabilities from Hexadite, which Microsoft recently acquired.

“This takes enterprise security to a new level, enabling our customers to move faster from device, data and insight to action against modern-day threats,” writes Rob Lefferts, partner director, Windows & Devices Group, Security & Enterprise.

Ever wonder how AI and the cloud help computers learn to read and comprehend natural language? The latest “Explanimators” episode from Microsoft Story Labs explores the world of machine reading – and how it will someday help doctors, lawyers and anyone who needs high-speed access to written, accumulated human knowledge.

That’s it for this week. Thanks for reading and see you next week!

Tags: Explanimators, Marea, Microsoft HoloLens, minecraft, Windows Defender ATP, Xbox One X