Oracle ships Java 14 with new preview, productivity features

Oracle’s latest release of the Java language and platform, Java 14 — also known as Oracle JDK 14 — brings a series of features focused on helping developers code faster and more efficiently.

The latest Java Development Kit (JDK) provides new developer-focused features including Java language support for switch expressions, new APIs for continuous monitoring of JDK Flight Recorder data, and extended availability of the low-latency Z Garbage Collector to macOS and Windows.

In addition, Java 14 includes three preview features that came out of the JDK Enhancement Proposal (JEP) process: Pattern Matching for instanceof (JEP 305), Records (JEP 359) and Text Blocks (JEP 368).

Java 12 introduced switch expressions in preview, and they are now standard in Java 14. This feature extends the Java switch statement so it can be used as either a statement or an expression. “Basically, we converted the switch statement into an expression and made it much simpler and more concise,” said Aurelio Garcia-Ribeyro, Oracle’s senior director of product management for the Java platform.
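
For readers who want to see the difference, here is a minimal sketch (the class and method names are our own, not from the JDK): a switch used as an expression over an enum, where each arrow arm yields a value directly.

import java.time.DayOfWeek;

public class SwitchDemo {
    static int numLetters(DayOfWeek day) {
        // The switch is an expression that returns a value; arrow arms
        // do not fall through, so no break statements are needed.
        return switch (day) {
            case MONDAY, FRIDAY, SUNDAY -> 6;
            case TUESDAY -> 7;
            case THURSDAY, SATURDAY -> 8;
            case WEDNESDAY -> 9;
        };
    }

    public static void main(String[] args) {
        System.out.println(numLetters(DayOfWeek.TUESDAY)); // prints 7
    }
}

Because the switch covers every DayOfWeek constant, the compiler accepts it without a default branch and will flag any constant that goes unhandled.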

Java 14 also gives developers a way to spot errors by continuously monitoring data from the JDK Flight Recorder, a tool integrated into the Java Virtual Machine that collects diagnostic and profiling data about a running Java application.
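
The new APIs behind this come from JFR Event Streaming (JEP 349), which lets a program subscribe to recorder events in-process rather than parse dump files after the fact. A minimal sketch, with an illustrative one-second sampling period:

import java.time.Duration;
import jdk.jfr.consumer.RecordingStream;

public class JfrStreamDemo {
    public static void main(String[] args) {
        // Stream Flight Recorder events from the JVM this code runs in.
        try (RecordingStream rs = new RecordingStream()) {
            rs.enable("jdk.CPULoad").withPeriod(Duration.ofSeconds(1));
            rs.onEvent("jdk.CPULoad", event ->
                System.out.println("CPU: " + event.getFloat("machineTotal")));
            rs.start(); // blocks, dispatching events until the stream closes
        }
    }
}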

Finally, the Z Garbage Collector, also known as ZGC, is a scalable, low-latency garbage collector. Garbage collection is a form of automatic memory management that frees up memory that is no longer in use or needed by the application. Prior to the Windows and macOS support introduced with Java 14, ZGC was available only on Linux/x64 platforms.
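
One practical note for trying it: ZGC is still marked experimental in JDK 14 on all platforms, so enabling it means unlocking experimental options on the java command line:

java -XX:+UnlockExperimentalVMOptions -XX:+UseZGC MyApp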

As for the preview features, Oracle has developed pattern matching for the Java “instanceof” operator, which is used to test whether an object is of a given type.
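
A short sketch of the preview syntax (our own example, compiled with --enable-preview): the pattern binds a typed variable when the test succeeds, which removes the usual explicit cast.

public class InstanceofDemo {
    static String describe(Object obj) {
        // 's' is bound only when the instanceof test passes,
        // so no separate (String) cast is needed.
        if (obj instanceof String s) {
            return "a string " + s.length() + " characters long";
        }
        return "not a string";
    }

    public static void main(String[] args) {
        System.out.println(describe("hello")); // a string 5 characters long
    }
}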

In turn, the introduction of Java Records cuts down on the verbosity of Java and provides a compact syntax for declaring classes. “Records will eliminate a lot of the boilerplate that has historically been needed to create a class,” Garcia-Ribeyro said.
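
As a rough sketch of what that means (again a preview feature, so --enable-preview is required), the single record declaration below stands in for the constructor, accessors, equals(), hashCode() and toString() that a conventional data class spells out by hand:

public class RecordDemo {
    // One line of state; the compiler generates the constructor,
    // accessors, equals(), hashCode() and toString().
    record Point(int x, int y) { }

    public static void main(String[] args) {
        Point p = new Point(3, 4);
        System.out.println(p);     // Point[x=3, y=4]
        System.out.println(p.x()); // 3
    }
}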

Text Blocks, initially introduced as a preview in Java 13, return as an enhanced preview in Java 14. Text blocks make it easy to express strings that span several lines of source code, and they improve the readability of strings in Java programs that denote code written in non-Java languages, Garcia-Ribeyro said.
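
A brief sketch (also gated behind --enable-preview in Java 14): the triple-quote delimiters let an embedded HTML snippet keep its line structure without escape sequences or concatenation.

public class TextBlockDemo {
    public static void main(String[] args) {
        String html = """
                <html>
                    <body>
                        <p>Hello, Java 14</p>
                    </body>
                </html>
                """;
        System.out.println(html);
    }
}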

Oracle needs to give Java developers the types of tools they need to evolve with the marketplace, said Bradley Shimmin, an analyst at Omdia in Longmeadow, Mass.

“When I look at what they’re doing with Java 14, they’re adding features that make the language more resilient, more performant and that make developers more productive in using the language,” he said.

Oracle takes iterative approach to Java updates

Java 14 also includes a new Packaging Tool that lets developers package Java applications for distribution in platform-specific formats. It ships as an incubator module so that Oracle can gather developer feedback as the tool nears finalization.

Among the more obscure features in this release are Non-Volatile Mapped Byte Buffers, which add a file mapping mode to the JDK for use with non-volatile memory. Helpful NullPointerExceptions improve the usability of NullPointerExceptions by describing precisely which variable was null; these exceptions occur when a program uses a reference that points to no location in memory as though it referenced an object. And the Foreign-Memory Access API allows Java programs to safely access foreign memory outside of the Java heap, the memory region the JVM allocates to applications running in the JVM.
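
To make the NullPointerException change concrete, here is a small hypothetical example; in JDK 14 the detailed messages are off by default and are enabled with -XX:+ShowCodeDetailsInExceptionMessages.

public class NpeDemo {
    static class Customer { String name; }
    static class Order { Customer customer; } // left null below

    public static void main(String[] args) {
        Order order = new Order();
        // Run with: java -XX:+ShowCodeDetailsInExceptionMessages NpeDemo
        // The exception message now pinpoints the null reference,
        // approximately: Cannot read field "name" because
        // "order.customer" is null.
        System.out.println(order.customer.name);
    }
}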

Java 14 is another new release of the language under the six-month cadence Oracle instituted more than two years ago. The purpose of the quicker cadence of releases is to get “more bite-size pieces that are easier to deploy and manage and that get the features to app developers in the enterprise to benefit from these new capabilities,” said Manish Gupta, Oracle’s Vice President of Marketing for Java and GraalVM.

Overall, Oracle wants to advance the Java language and platform to make it work well for new cloud computing applications as well as platforms such as mobile and IoT. In 2017, Oracle spun out enterprise Java, known as Java Enterprise Edition or Java EE, to the Eclipse Foundation. Eclipse has since created a new enterprise Java specification called Jakarta EE.

“When I think about Java 14, what I’m seeing is that Oracle is not only staying true to what they promised back when they acquired Sun Microsystems, which was to do no harm to Java, but that they are trying to now evolve Java in such a way that it can remain relevant into the future,” Shimmin said.

Splice Machine 3.0 integrates machine learning capabilities, database

Databases have long been used for transactional and analytics use cases, but they also have practical utility to help enable machine learning capabilities. After all, machine learning is all about deriving insights from data, which is often stored inside a database.

San Francisco-based database vendor Splice Machine is taking an integrated approach to enabling machine learning with its eponymous database. Splice Machine is a distributed SQL relational database management system that includes machine learning capabilities as part of the overall platform.

Splice Machine 3.0 became generally available on March 3, bringing with it updated machine learning capabilities. It also adds a new cloud-native deployment model based on Kubernetes and enhanced replication features.

In this Q&A, Monte Zweben, co-founder and CEO of Splice Machine, discusses the intersection of machine learning and databases and provides insight into the big changes that have occurred in the data landscape in recent years.

How do you integrate machine learning capabilities with a database?

Monte Zweben: The data platform itself has tables, rows and schema. The machine learning manager that we have native to the database has notebooks for developing models, Python for manipulating the data, algorithms that allow you to model and model workflow management that allows you to track the metadata on models as they go through their experimentation process. And finally we have in-database deployment.

So as an example, imagine a data scientist in the insurance industry working in Splice Machine. They have an application for claims processing, and they are building out models inside Splice Machine to predict claims fraud. There’s a function in Splice Machine called deploy, and what it will do is take a table and a model and generate database code. The deploy function builds a trigger on the database table that tells the table to call a stored procedure containing the model for every new record that comes into the table.

So what does this mean in plain English? Let’s say new claims come into the claims table. Every time they do, the system automatically triggers, grabs those claims, runs the model that predicts claims fraud and outputs those predictions into another table. And now, all of a sudden, you have real-time, in-the-moment machine learning that is detecting claims fraud on first notice of loss.
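
To make that concrete, here is a hypothetical sketch of the flow from the application side using plain JDBC. The table names, columns and connection details are invented for illustration; the trigger and the stored procedure wrapping the model are the pieces Splice Machine’s deploy function generates behind the scenes.

import java.sql.*;

public class ClaimsDemo {
    public static void main(String[] args) throws SQLException {
        try (Connection conn = DriverManager.getConnection(
                "jdbc:splice://localhost:1527/splicedb", "user", "password")) {
            // Insert a new claim; the generated trigger scores it on arrival.
            try (PreparedStatement insert = conn.prepareStatement(
                    "INSERT INTO claims (claim_id, amount) VALUES (?, ?)")) {
                insert.setInt(1, 42);
                insert.setDouble(2, 9800.00);
                insert.executeUpdate();
            }
            // The model's output lands in a predictions table.
            try (Statement stmt = conn.createStatement();
                 ResultSet rs = stmt.executeQuery(
                         "SELECT claim_id, fraud_score FROM claim_predictions")) {
                while (rs.next()) {
                    System.out.println(rs.getInt("claim_id")
                            + " -> " + rs.getDouble("fraud_score"));
                }
            }
        }
    }
}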

What does distributed SQL mean to you?

Zweben: So at its heart, it’s about sharing data across multiple nodes. That provides you the ability to parallelize computation and gain elastic scalability. That is the most important distributed attribute of Splice Machine.

In our new 3.0 release, we just added distributed replication. It’s another element of distribution where you have secondary Splice Machine instances in geo-replicated areas, to handle failover for disaster recovery.

What’s new in Splice Machine 3.0?

Zweben: We moved our cloud stack for Splice Machine from an old Mesos architecture to Kubernetes. Now our container-based architecture is all Kubernetes, and that has given us the opportunity to enable the separation of storage and compute. You literally can pause Splice Machine clusters and turn them back on. This is a great utility for consumption-based usage of databases.

Along with our upgrade to Kubernetes, we also upgraded our machine learning manager from an older notebook technology called Zeppelin to a newer notebook technology that has really gained momentum in the marketplace, as much as Kubernetes has in the DevOps world. Jupyter notebooks have taken off in the data science space.

We’ve also enhanced our workflow management tool, MLflow, an open source tool that originated with Databricks; we’re part of that community. MLflow allows data scientists to track their experiments and keeps a record of model metadata available for governance.

What’s your view on open source and the risk of a big cloud vendor cannibalizing open source database technology?

Zweben: We do compose many different open source projects into a seamless and highly performant integration. Our secret sauce is how we put these things together at a very low level, with transactional integrity, to enable a single integrated system. This composition that we put together is open source, so that all of the pieces of our data platform are available in our open source repository, and people can see the source code right now.

I’m intensely worried about cloud cannibalization. I switched to an AGPL license specifically to protect against cannibalization by cloud vendors.

On the other hand, we believe we’re moving up the stack. If you look at our machine learning package, how inextricably it’s linked with the database, and the reference applications that we have in different segments, we’re going to be delivering more and more high-level application functionality.

What are some of the biggest changes you’ve seen in the data landscape over the seven years you’ve been running Splice Machine?

Zweben: With the first generation of big data, it was all about data lakes, and let’s just get all the data the company has into one repository. Unfortunately, that has proven time and time again, at company after company, to just be data swamps.

The repositories work and they’re scalable, but no one is using the data. This was a mistake for several reasons.

Instead of thinking about storing the data, companies should think about how to use the data. Start with the application and how you are going to make the application leverage new data sources.

The second reason this was a mistake is organizational: the data scientists who know AI were all centralized in one data science group, away from the application. They are not the subject matter experts for the application.

When you focus on the application and retrofit the application to make it smart and inject AI, you can get a multidisciplinary team. You have app developers, architects, subject-matter experts, data engineers and data scientists, all working together on one purpose. That is a radically more effective and productive organizational structure for modernizing applications with AI.

How to create and deploy a VMware VM template

A VMware VM template — also known as a golden image — is a perfect copy of a VM from which you can deploy identical VMs. Templates include a VM’s virtual disks and settings, and they not only save users time but also help them avoid errors when configuring new Windows and Linux VMs.

VM templates enable VMware admins to create exact copies of VMs for cloning, converting and deploying. They can be used to simplify configuration and ensure the standardization of VMs throughout your entire ecosystem. Templates can also be used as long-term backups of VMs. However, you can’t operate a VM template without converting it back to a standard VM.

vSphere templates can be accessed through your content library. The content library wizard walks you through configuration steps, such as publishing and optimizing templates; it designates roles and privileges that you can then assign to users, and it streamlines VM deployment options.

Best practices for Hyper-V templates

You can create and deploy VM templates through Hyper-V as well. Hyper-V templates enable users to deploy VMs quickly and with greater security, such as with shielded VMs, and they reduce network congestion. They rely on System Center Virtual Machine Manager (SCVMM) and require specific configurations.

To create a Hyper-V template, select a base object from which you want to create the template: an existing VM template, a virtual hard disk or a VM. Assign a name to the new template and configure the virtual hardware and operating system settings the deployed VMs will use.

Keep in mind that not every VM is a viable template candidate. If your system partition is not the same as your Windows partition, you won’t be able to use that VM as a template source.

To create a shielded VM — one that protects against a compromised host — run the Shielded Template Disk Creation Wizard. Specify your required settings in the wizard and click Generate to produce the template disk, then copy that disk to your template library. The disk should appear in the library with a small shield icon, which signifies that it is shielded.

How to create a VMware VM template with Packer

Packer is a free tool that can help you automate vSphere template creation and management. It features multiple builders optimized for VMware Fusion, Workstation Pro or Workstation Player. The vmware-iso builder supports using a remote ESXi server to build a template, and the vsphere-iso builder connects to a vCenter environment and can build on any host in a cluster.

When you use Packer to make a VM template, two main file types are involved: a JSON file that defines the template and an autounattend.xml file that automates Windows installation on your VM. Once your scripts, JSON file and autounattend file are ready, you can build the VM template in Packer. When the build is complete, Packer converts the VM to a template that you can view and deploy through PowerCLI.

Use PowerCLI to deploy a template

You can use PowerCLI to deploy new VMs from a template. Create an OS customization specification through PowerCLI to start the deployment process; this ensures that when you create VMs from a template, you can still change certain settings to make each one unique. These settings include the organization name, security identifier, local administrator password, Active Directory domain, time zone, domain credentials, Windows product key and AutoLogonCount registry key. The PowerCLI cmdlet might resemble the following:

C:\> New-OSCustomizationSpec -Name 'WindowsServer2016' -FullName 'TestName' -OrgName 'MyCompany' -OSType Windows -ChangeSid -AdminPassword (Read-Host -AsSecureString) -Domain 'NTDOMAIN' -TimeZone 035 -DomainCredentials (Get-Credential) -ProductKey '5555-7777-3333-2222' -AutoLogonCount 1

After your OS is customized, you can easily deploy a VM from a template or multiple VMs from the same template. Start by placing the OS customization specifications into the variable $Specs.

$Specs = Get-OSCustomizationSpec -Name 'WindowsServer2016'

Then, load the VM template into the variable $Template.

$Template = Get-Template -Name 'Windows2016Template'

Finish by deploying your VM using the New-VM cmdlet and piping in your template and OS specifications.

New-VM -Name 'Windows16VM' -Template $Template -OSCustomizationSpec $Specs -VMHost 'ESXiHost' -Datastore 'VMDatastore'

Troubleshoot VM templates

There are a few common mistakes in VM template creation and deployment that you’ll want to avoid.

Creating a VMware template directly from a VM destroys the VM, so always create a clone of the VM and build the template from the clone. Even if you create a VM solely to become a template, template creation could fail and destroy your VM. A common reason for template creation failure is trying to create a template from a Linux VM: the template creation process tries to Sysprep the VM, but Sysprep is designed for Windows OSes.

You also need to ensure that the model VM you want to turn into a template isn’t domain-joined. Joining a VM to an Active Directory domain can cause the system to create a computer account for the VM, which then leaves that computer account orphaned during the template creation process. To work around this issue, have the template itself handle the domain join, and secure the library share in a way that prevents anyone other than VM admins from having access.

Finally, don’t include any preinstalled applications on a VM template. The Sysprep process often breaks such applications. You can instead use an application profile or configure a VM template to run a script for automated application installation.

For Sale – Huawei Matebook X Pro – i7, 512GB, MX150 W/Warranty Apr2020

Hi guys

I’ve decided to let this go, as I’ve not had much use out of it since purchasing it new. I’m also considering building a gaming desktop PC, so this will help with funding that. I have experienced no issues with the device.

Once received, I immediately installed a ‘dbrand’ matte black skin on the top, bottom and trackpad (this can easily be removed should you wish to, but why would you…). The skin has kept the device immaculate and free from any scratches.

This will also come with a ‘MagSafe’-style USB Type-C charging cable and port adapter, similar to Apple’s innovation; this protects the device from any damage, as demonstrated in the video –

Warranty: Valid with Huawei until 05Apr2020

This comes boxed with all accessories:
– USB Type C Port (HDMI, USB A and USB Type C ports)
– USB Type C Charging Cable (3m I believe?!)
– Charging Adapter
Extras:
1x MagSafe style USB Type C Cable and Adapter

This is the top spec model which includes:
CPU: Intel core i7 8550U
GPU: Nvidia MX150 + Intel UHD 620
RAM: 8GB
SSD: 512GB
Screen: 3000 x 2000 touch screen

Postage: This will be shipped via a next-day delivery option, or feel free to inspect and collect.

Note: this is already advertised elsewhere

If you require any further pics or anything, drop me a message.

For Sale – 6 x 4TB & 12 x 2TB HDD’s

I’m also quite interested in a couple of drives – could you update the thread with what you have left?

Cheers

For Sale – Lenovo Dock 04W3947 – TBS 6281 DVB-T2 TV Tuner Card – 1 x 16GB Crucial DDR3L 1866 Notebook Ram

Lenovo T450S Sold

I also have a Lenovo Dock 04W3947; the laptop simply clicks into this and gains 1 x HDMI port, 6 x USB ports, 1 x Ethernet, 1 x VGA, 1 x DVI and 2 x DisplayPort. There are also two power cables: one can be carried with the laptop and one can be left hooked up to the dock.

Dock – £80

I then separately have the following for sale:

TBS 6281-SE Dual Terrestrial HD low-profile PCIe TV tuner card (DVB-T2), in great condition. I used it to watch live TV on my HTPC via a Kodi/MediaPortal combination; works great – £50

A single stick of 16GB Crucial DDR3L SODIMM 1866 RAM – £80

Price and currency: £80, £50 & £80
Delivery: Delivery cost is included within my country
Payment method: PPG or BT
Location: Aberdeen
Advertised elsewhere?: Advertised elsewhere
Prefer goods collected?: I have no preference

AWS AI tools focus on developers

AWS is the undisputed leader in the cloud market. As for AI, the cloud division of tech giant Amazon is also in a dominant position.

“Machine learning is at a place now where it is accessible enough that you don’t need Ph.Ds,” said Joel Minnick, head of product marketing for AI, machine learning and deep learning at AWS.

Partly, that’s due to a natural evolution of the technology, but vendors such as Google, AWS, IBM and DataRobot have also made strides in simplifying the process of creating and deploying machine learning and deep learning models.

AWS AI

Over the last few years, AWS has invested heavily in making it easier for developers and engineers to create and deploy AI models, Minnick said, speaking with TechTarget at the AWS re:Invent 2019 user conference in Las Vegas in December.

AWS’ efforts to simplify the machine learning lifecycle were on full display at re:Invent. During the opening keynote, led by AWS CEO Andy Jassy, AWS revealed new products and updates for Amazon SageMaker, AWS’ full-service suite of machine learning development, deployment and governance products.

Those products and updates included new and enhanced tools for creating and managing notebooks, automatically making machine learning models, debugging models and monitoring models.

SageMaker Autopilot, a new AutoML product, in particular, presents an accessible way for users who are new to machine learning to create and deploy models, according to Minnick.

In general, SageMaker is one of AWS’ most important products, according to a blog-post-styled report on re:Invent from Nick McQuire, vice president of enterprise research at CCS Insight. The report noted that AWS, due largely to SageMaker, its machine learning-focused cloud services, and a range of edge and robotics products, is a clear leader in the AI space.

“Few companies (if any) are outpacing AWS in machine learning in 2019,” McQuire wrote, noting that SageMaker alone received 150 updates since the start of 2018.

Developers for AWS AI

In addition to the SageMaker updates, AWS in December unveiled another new product in its Deep series: DeepComposer.

The product series, which also includes DeepLens and DeepRacer, is aimed at giving machine learning and deep learning newcomers a simplified and visual means to create specialized models.

Introduced in late 2017, DeepLens is a camera that enables users to run deep learning models on it locally. The camera, which is fully programmable with AWS Lambda, comes with tutorials and sample projects to help new users. It integrates with a range of AWS products and services, including SageMaker and its Amazon Rekognition image analysis service.

“[DeepLens] was a big hit,” said Mike Miller, director of AWS AI Devices at AWS.

DeepRacer, revealed the following year, enables users to apply machine learning models to radio-controlled (RC) model cars and make them race autonomously along tracks. Users can build models in SageMaker and bring them into a simulated racetrack, where they can train the models before loading them into a 1/18th-scale race car.

An AWS racing league makes DeepRacer competitive, with AWS holding yearlong tournaments composed of multiple races. DeepRacer, Miller declared, has been exceedingly successful.

“Tons of customers around the world have been using DeepRacer to engage and upskill their employees,” Miller said.

Dave Anderson, director of technology at Liberty Information Technology, the IT arm of Liberty Mutual, said many people on his team take part in the DeepRacer tournaments.

“It’s a really fun way to learn machine learning,” Anderson said in an interview. “It’s good fun.”

Composing with AI

Meanwhile, DeepComposer, as the name suggests, helps train users on machine learning and deep learning through music. The product comes with a small keyboard that plugs into a PC, along with a set of pretrained music genre models. The keyboard itself isn’t unusual, but by using the models and accompanying software, users can automatically create and tweak fairly basic pieces of music within a few genres.

With DeepComposer, along with DeepLens and DeepRacer, “developers of any skill level can find a perch,” Miller said.

The products fit into Amazon’s overall AI strategy well, he said.

“For the last 20 years, Amazon has been investing in machine learning,” Miller said. “Our goal is to bring those same AI and machine learning techniques to developers of all types.”

The Deep products are just “the tip of the spear for aspiring machine learning developers,” Miller said. Amazon’s other products, such as SageMaker, extend that machine learning technology development strategy.

“We’re super excited to get more machine learning into the hands of more developers,” Miller said.
