For Sale – Skull Canyon NUC + Other NUCs

——————————————-
= 1 =
This is up for sale; originally bought from here:

For Sale – Intel NUC8i7HVK NUC w/ 500GB 970 EVO and 16GB DDR

I bought 32GB of RAM as it was needed for the project, so you have the option of 16GB or 32GB of RAM.

£715 with 16GB RAM – SOLD

Both prices include postage
Will post pictures later this evening.

——————————————-
= 2 =
I have an i5 based NUC in a Fanless Tranquil case.
The NUC is a D54250WYK and specs can be found here:

Intel® NUC Kit D54250WYK Product Specifications

Tranquil PC H22 Fanless NUC Chassis

And here is the info about the other parts..

  • i5 Intel Haswell NUC – Intel® Core™ i5-4250U Processor (3M Cache, up to 2.60 GHz)
  • Tranquil PC H22 Fanless NUC Chassis (supports additional 2.5″ HDD)
  • 8GB G.Skill Ripjaws DDR3 2133MHz SO-DIMM Low-voltage 1.35V laptop memory kit (1x 8GB) CL11
  • Samsung 840 EVO 120GB mSATA Solid State Drive
  • Intel Dual Band Wireless-AC 7260 Wi-Fi and Bluetooth card
  • Windows 10 Home License (ESD)


History: I bought this off someone here and it originally had 16GB of RAM, but on arrival it didn’t recognise the full 16GB.
I removed the memory modules one by one and found that one of the slots wasn’t being recognised, so I removed that module and have been using it with 8GB ever since.

I have mainly been using this to run Windows 10 for my indoor gym bike, as part of a virtual indoor bike trainer (connected to a large TV).

Maybe 3 months ago I swapped this for an Apple TV (it was a pain to use with a remote keyboard etc.) and since then have been using it to run VMware (currently 6.7, running off a USB stick).

I now have some i7 NUCs and this isn’t needed anymore. Obviously this is now limited to 8GB of RAM which may or may not be an issue. The unit has never missed a beat.

With the fanless case it is not massively heavy, but will not be super cheap to post.

With this in mind I’d be looking for £100 for the unit including postage.

No box, just the NUC, PSU and HDMI cable.

It uses mini HDMI; I can throw in a cable for that.


— selling “sold as discussed in thread” —
——————————————-
= 3 =

INTEL NUC NUC5CPYH

Spec:

INTEL NUC NUC5CPYH
– Celeron N3050 @ 1.6GHz
– 8GB DDR3L
– 80GB internal 7200rpm 2.5″ SATA drive
– VGA (HDB15) & HDMI 1.4b
– 4 USB3 & SDXC
– Windows 10 Home License (ESD)

Full specs at:


Intel® NUC Kit NUC5CPYH

In good condition and all works as expected. It has a small internal fan, but CPU is only 6W so very quiet (almost silent). Sorry no box, just the NUC and PSU.

Looking for around £95 including delivery.

— moved to Ebay —
——————————————-
= 4 =
Intel i7 NUC Skull Canyon
32GB RAM
250GB V-NAND 970 EVO SSD

Warranty until September 2020 via
Warranty Information
Note that I have had a shuffle around: I have three of these and now only require one, so two are for sale. The previously mentioned one with damaged screws I am keeping, as it doesn’t bother me.

This one has no issues at all and comes with Windows 10 Pro installed (licensed via ESD). If you need to reload (using Microsoft’s media creation tool and a USB stick), it will re-license itself when it connects to the internet.

Sold as just the NUC and Power Supply, sorry no box but will be very well packed.

For a complete technical specification please see:

Intel® NUC Kit NUC6i7KYK Product Specifications

£525 including P&P

——————————————-

——————————————-

Price and currency: Various
Delivery: Delivery cost is included within my country
Payment method: PPG or BT
Location: BRISTOL
Advertised elsewhere?: Not advertised elsewhere
Prefer goods collected?: I have no preference

______________________________________________________
This message is automatically inserted in all classifieds forum threads.
By replying to this thread you agree to abide by the trading rules detailed here.
Please be advised, all buyers and sellers should satisfy themselves that the other party is genuine by providing the following via private conversation to each other after negotiations are complete and prior to dispatching goods and making payment:

  • Landline telephone number. Make a call to check out the area code and number are correct, too
  • Name and address including postcode
  • Valid e-mail address

DO NOT proceed with a deal until you are completely satisfied with all details being correct. It’s in your best interest to check out these details yourself.

Go to Original Article
Author:

Breaking Bard: Using Microsoft AI to unlock Shakespeare’s greatest works

Spoiler alert: At the end of Romeo and Juliet, they both die.

OK, as spoilers go, it’s not big. Most people have read the play, watched one of the famous films or sat through countless school lessons devoted to William Shakespeare and his work. They know it doesn’t end well for Verona’s most famous couple.

In fact, the challenge is finding something no one knows about the world-famous, 400-year-old play. That’s where artificial intelligence can help.

Phil Harvey, a Cloud Solution Architect at Microsoft in the UK, used the company’s Text Analytics API on 19 of The Bard’s plays. The API, which is available to anyone as part of Microsoft’s Azure Cognitive Services, can be used to identify sentiment and topics in text, as well as pick out key phrases and entities. This API is one of several Natural Language Processing (NLP) tools available on Azure.

By creating a series of colourful, Power BI graphs (below) showing how negative (red) or positive (green) the language used by The Bard’s characters was, he hoped to shine a new light on some of the greatest pieces of literature, as well as make them more accessible to people who worry the plays are too complex to easily understand.

Harvey said: “People can see entire plotlines just by looking at my graphs on language sentiment. Because visual examples are much easier to absorb, it makes Shakespeare and his plays more accessible. Reading language from the 16th and 17th centuries can be challenging, so this is a quick way of showing them what Shakespeare is trying to do.

“It’s a great example of data giving us new things to know and new ways of knowing it; it’s a fundamental change to how we process the world around us. We can now pick up Shakespeare, turn it into a data set and process it with algorithms in a new way to learn something I didn’t know before.”

What Harvey’s graphs reveal is that Romeo struggles with more extreme emotions than Juliet. Love has a much greater effect on him, challenging stereotypes of the time that women – the fairer sex – were more prone to the highs and lows of relationships.

“It’s interesting to see that the male lead is the one with more extreme emotions,” Harvey added. “The longest lines, both positive and negative, are spoken by him. Juliet is steadier; she is positive and negative but not extreme in what she says. Romeo is a fellow of more extreme emotion, he’s bouncing around all over the place.

“Macbeth is also interesting because there are these two peaks of emotion, and Shakespeare uses the wives at these points to turn the story. I also looked at Helena and Hermia in A Midsummer Night’s Dream, because they have a crossed-over love story. They are both positive at the start but then they find out something and it gets negative towards the end.”

His Shakespeare graphs are the final step in a long process. After downloading a text file of The Bard’s plays from the internet, Harvey had to process the data to prepare it for Microsoft’s AI algorithms. He removed all the stage directions, keeping the act and scene numbers, the characters’ names and what they said. He then uploaded the text to Microsoft Cognitive Services API, a set of tools that can be used in apps, websites and bots to see, hear, speak, understand and interpret users through natural methods of communication.
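To make that cleanup step concrete, here is a minimal sketch of the kind of preprocessing described: stripping stage directions while keeping act/scene markers, character names and dialogue. It assumes stage directions appear in square brackets; the excerpt and function name are illustrative, not Harvey’s actual code.

```python
import re

def strip_stage_directions(raw_play: str) -> str:
    """Remove bracketed stage directions, keeping act/scene markers,
    character names and dialogue (a simplification of real play texts)."""
    # Drop anything in square brackets, e.g. "[Exit ROMEO]"
    no_directions = re.sub(r"\[[^\]]*\]", "", raw_play)
    # Trim the whitespace left behind and drop now-empty lines
    lines = [line.strip() for line in no_directions.splitlines()]
    return "\n".join(line for line in lines if line)

excerpt = """ACT II, SCENE II
ROMEO: But, soft! what light through yonder window breaks? [Juliet appears above]
JULIET: O Romeo, Romeo! wherefore art thou Romeo? [Sighs]"""

print(strip_stage_directions(excerpt))
```

The cleaned lines, one per speech, can then be submitted to the sentiment endpoint in batches.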

The Text Analytics API is pre-trained with an extensive body of text with sentiment associations. The model uses a combination of techniques during text analysis, including text processing, part-of-speech analysis, word placement and word associations.

After scanning the Shakespeare plays, Microsoft’s NLP tool gave the lines of dialogue a score between zero and one – scores close to one indicated a positive sentiment, and scores close to zero indicated a negative sentiment.
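The mapping from score to colour in the graphs can be pictured with a small sketch. The 0.5 ± 0.1 thresholds below are an illustrative assumption, not the API’s own classification, and the example scores are invented:

```python
def label_sentiment(score: float, band: float = 0.1) -> str:
    """Map a 0-1 sentiment score to a label, mirroring the red/green
    colouring of the graphs. The 0.5 +/- band thresholds are an
    illustrative assumption, not the API's own cut-offs."""
    if score >= 0.5 + band:
        return "positive"   # rendered green in the graphs
    if score <= 0.5 - band:
        return "negative"   # rendered red
    return "neutral"

# Hypothetical scores for two well-known lines
scored_lines = {
    "But, soft! what light through yonder window breaks?": 0.93,
    "A plague o' both your houses!": 0.04,
}
for text, score in scored_lines.items():
    print(f"{label_sentiment(score):8s} ({score:.2f})  {text}")
```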

However, before you start imagining a world in which only robots read books before telling humans the gist of what happened, Harvey discovered some unexpected challenges with his test.

While the AI system worked well for Shakespeare plays that contained straightforward plots and dialogue, it struggled to determine if more nuanced speech was positive or negative. The algorithm couldn’t work out whether Hamlet’s mad ravings were real or imagined, or whether characters were being deceptive or telling the truth. That meant the AI labelled events as positive when they were negative, and vice versa. The AI believed The Comedy of Errors was a tragedy because of the physical, slapstick moments in the play.


Harvey realised that the parts of the plays that dealt with what truly makes us unique as humans – joking, elation, lying, double meanings, subterfuge, sarcasm – could only be noticed and interpreted by human readers. His project required AI working alongside humans to truly understand and fully appreciate Shakespeare.

Harvey insists that his experiments with Shakespeare’s plays are just a starting point but that the same combination of AI and humans can eventually be extended to companies and their staff, too.

“Take the example of customers phoning their energy company,” he said. “With Microsoft’s NLP tools, you could see if conversations that happen after 5pm are more negative than those that happen at 9am, and deploy staff accordingly. You could also see if a call centre worker turns conversations negative, even if they start out positive, and work with that person to ensure that doesn’t happen in the future.

“It can help companies engage with data in a different way and assist them with everyday tasks.”

Harvey also said journalists could use the tool to see how readers are responding to their articles, or social media experts would get an idea of how consumers viewed their brand.

For now, Harvey is concentrating on the Classics and is turning his attention to Charles Dickens, if he can persuade the V&A in London to let him study some of their manuscripts.

“In the V&A manuscripts, you can see where Dickens has crossed out words. I would love to train a custom vision model on that to get a page by page view of his changes. I could then look at a published copy of the text and see which parts of the book he worked on most; maybe that part went well but he had trouble with this bit. Dickens’s work was serialised in newspapers, so we might be able to deduce whether he was receiving feedback from editors that we didn’t know about. I think that’s amazing.”


Go to Original Article
Author: Microsoft News Center

How EHT’s black hole image data is stored and protected

On April 10, 2019, the Event Horizon Telescope published the first black hole image, a historic and monumental achievement that included lots of IT wizardry.

The Event Horizon Telescope (EHT) includes telescopes spread across the world at eight different sites, including the South Pole. Each site captures massive amounts of radio signal data, which goes to processing centers at the MIT Haystack Observatory in Westford, Mass., and the Max Planck Institute for Radio Astronomy in Bonn, Germany.

The data for the now-famous black hole image — captured in 2017 from galaxy M87, 53 million light-years away — required around 3.5 PB of storage. It then took two years to correlate the data to form an image.

The project’s engineers had to find a way to store the fruits of this astronomy at a cost that was, well, less than astronomical.

EHT’s IT challenges included finding the best way to move petabyte-scale data from multiple sites, acquiring physical media that was durable enough to handle high altitudes and a way to protect all of this data cost-effectively.

The cloud is impractical

Normally, the cloud would be a good option for long-term storage of unifying data sourced from multiple, globally distributed endpoints, which was essentially the role of each individual telescope. However, EHT data scientist Lindy Blackburn said cloud is not a cold storage option for the project.

Each telescope records at a rate of 64 Gbps, and each observation period can last more than 10 hours. This means each site generates around half a petabyte of data per run. With each site recording simultaneously, Blackburn said the high recording speed and sheer volume of data captured made it impractical to upload to a cloud.
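A back-of-envelope check of those figures (decimal units, recording rate as quoted above):

```python
# Rough check of the per-run recording volume quoted above.
rate_gbps = 64                 # per-telescope recording rate, gigabits/s
hours = 10                     # a single observation can run longer
bytes_per_run = rate_gbps / 8 * 1e9 * hours * 3600
tb = bytes_per_run / 1e12
print(f"~{tb:.0f} TB per telescope per {hours}-hour run")
# ~288 TB per telescope per 10-hour run
```

At 64 Gbps, half a petabyte corresponds to roughly 17 hours of recording, consistent with runs lasting “more than 10 hours.”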

Petabytes of raw radio signal data was processed to form the world’s first black hole image.

“At the moment, parallel recording to massive banks of hard drives, then physically shipping those drives somewhere is still the most practical solution,” Blackburn said.

It is also impractical to use the cloud for computing, said Geoff Crew, co-leader of the EHT correlation working group at Haystack Observatory. Haystack is one of EHT’s two correlation facilities, where a specialized cluster of computers combine and process the radio signal data of the telescopes to eventually form a complete black hole image.

There are about 1,000 computing threads at Haystack working on calculating the correlation pattern between all the telescopes’ data. Even that is only enough to play back and compute the visibility data at 20% of the speed at which the data was collected. This is a bottleneck, but Crew said using the cloud wouldn’t speed the process.
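That 20% figure implies every hour of observation needs five hours of correlator time, as this small worked example shows:

```python
# Correlation at 20% of the original recording speed means a
# five-to-one ratio of correlator time to observation time.
playback_fraction = 0.20
observation_hours = 10
correlation_hours = observation_hours / playback_fraction
print(f"{observation_hours} h of data -> {correlation_hours:.0f} h to correlate")
# 10 h of data -> 50 h to correlate
```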

“Cloud computing does not make sense today, as the volume of data would be prohibitively expensive to load into the cloud and, once there, might not be physically placed to be efficiently computed,” Crew said.

Crew added that throwing more hardware at it would help, but time and human hours are still spent looking at and verifying the data. Therefore, he said it’s not justifiable to spend EHT’s resources on making the correlators run faster.

Although Blackburn concluded physically transporting the data is currently the best option, even that choice presents problems. One of the biggest constraints is transportation at the South Pole, which is closed to flights from February to November. The cost and logistics involved with tracking and maintaining a multipetabyte disk inventory is also challenging. Therefore, Blackburn is always on the lookout for another method to move petabyte-scale data.

“One transformative technology for the EHT would be if we could send out raw data directly from the telescopes via high-speed communication link, such as via satellite laser relay, and bypass the need to move physical disks entirely,” Blackburn said. “Another more incremental advancement would be a move to solid-state recording, which would be lighter, faster and more compact. However, the timeline for that would depend entirely on the economics of SSD versus magnetic storage costs.”

Capturing a black hole 53 million light-years away requires multiple telescopes working together.

Using helium hard drives

Another problem EHT ran into regarding the black hole image data was the frequency at which traditional hard drives failed at high altitudes. Vincent Fish, a research scientist at Haystack who is in charge of science operations, logistics and scheduling for EHT, said each EHT telescope ranged from 7,000 feet above sea level to 16,000 feet.

“For years, we had this problem where hard drives would fail,” Fish said. “At high altitudes, the density of air is lower, and the old, unsealed hard drives had a high failure rate at high altitudes.”


The solution came in the form of helium hard drives from Western Digital’s HGST line. Hermetically sealed helium drives were designed to be lighter, denser, cooler and faster than traditional hard drives. And because they were self-contained environments, they could survive the high altitudes in which EHT’s telescopes operated.

“The industry ended up solving this problem for us, and not because we specifically asked them to,” Fish said.

EHT first deployed 200 × 6 TB helium hard drives in 2015, when it was focused on studying the black hole at Sagittarius A* (pronounced Sagittarius A-star). Blackburn said EHT currently uses about 1,000 drives, some of which have 10 TB of capacity. It also has added helium drives from Seagate and Toshiba, along with Western Digital.

“The move to helium-sealed drives was a major advancement for the EHT,” Blackburn said. “Not only do they perform well at altitude and run cooler, but there have been very few failures over the years. For example, no drives failed during the EHT’s 2017 observing campaign.”

No backup for raw data

After devising a way to capture, store and process a massive amount of globally distributed data, EHT had to find a workable method of data protection. EHT still hasn’t found a cost-effective way to replicate or protect the raw radio signal data from the telescope sites. However, once the data has been correlated and reduced to tens of terabytes, it is backed up on site on several different RAID systems and on Google Cloud Storage.

“The reduced data is archived and replicated to a number of internal EHT sites for the use of the team, and eventually, it will all be publicly archived,” Crew said. “The raw data isn’t saved; we presently do not have any efficient and cost-effective means to back it up.”


Blackburn said, in some ways, the raw data isn’t worth backing up. Because of the complexity of protecting such a large amount of data, it would be simpler to run another observation and gather a new set of data.

“The individual telescope data is, in a very real sense, just ‘noise,’ and we are fundamentally interested only in how much the noise between telescopes correlates, on average,” Blackburn said. “Backing up original raw data to preserve every bit is not so important.”

Backing up the raw data for the black hole image may become important if EHT ends up sitting on it for long periods of time as a result of the computational bottlenecks, Blackburn admitted. However, he said he can’t seriously consider implementing a backup process unless it is “sufficiently straightforward and economical.”

Instead, he said he’s looking at where technology might be in the next five or 10 years and determining if recording to hard drives and shipping them to specialized processing clusters will still be the best method to handle petabyte-scale raw data from the telescopes.

“Right now, it is not clear if that will be continuing to record to hard drives and using special-purpose correlation clusters, recording to hard drives and getting the data as quickly as possible to the cloud, or if SSD or even tape technology will progress to a point to where they are competitive in both cost and speed to hard disks,” Blackburn said.

Fish suggested launching a constellation of satellites via spacecraft rideshare initiatives, either through NASA or a private company, isn’t entirely out of reach, either. Whether it’s the cloud or spaceships, the technology to solve EHT’s petabyte-scale problem exists, but cost is the biggest hurdle.

“Most of our challenges are related to insufficient money, rather than technical hurdles,” Crew said.

Go to Original Article
Author:


Introduction to Azure Network Security Groups (NSGs)

Microsoft Azure and other public clouds are changing the way in which enterprises deploy and secure their distributed services. One of the main benefits of deploying in the public cloud is the ability to quickly allow users or applications to connect to your service from anywhere in the world, providing them with a scalable and highly available virtual networking infrastructure. Since these networks are the entry points to your application, they should be the first line of defense against threats, only accepting traffic from users, applications or protocols which have been explicitly approved. Securing these networks can be challenging because they may contain diverse virtual appliances and a dynamic network infrastructure, while not giving admins access to any of the underlying physical networking infrastructure.

Microsoft’s solution to simplify virtual network security is through a management layer known as a Network Security Group (NSG) which allows administrators to easily organize, filter and route different types of network traffic. Any Azure virtual network can be placed into a security group where different inbound and outbound rules can be configured to allow or deny certain types of traffic. This blog will review some of the capabilities and best practices for Azure NSGs.

Creating a Network Security Group (NSG)

Microsoft Azure provides a simple interface to create the Network Security Group from both a modern (recommended) and “classic” view. From the Network Security Group interface, it is easy to add a new security group, where you will specify the name, subscription, Azure resource group, and location where it will be configured. The following screenshot shows the creation of an Azure NSG from the modern interface.


Figure 1 – Creating a new Azure Network Security Group (NSG)

Network Security Group Rules

After creating this NSG, you will have the ability to manage its individual rules. A rule is used to define whether the network traffic is safe and should be permitted through the network, or denied.

A rule consists of the following components:

  • Name – A unique name which should be easy for administrators to use to find the rule.
  • Priority – This is an integer between 100 and 4096, which should be unique. This value defines the processing order of the rule, with rules containing lower values (higher priority) being executed first.
  • Source or destination – This field indicates which application or user(s) the rule is applicable for. This can be an IP Address, IP Address range or Azure resource.
  • Protocol – The TCP, UDP or ICMP protocol which will be analyzed.
  • Direction – This indicates whether the traffic is inbound or outbound.
  • Port Range – This will specify which port or range of ports the rule is applicable for.
  • Action – Allow (permit the traffic through) or Deny (block it). This is the action taken by the NSG when network traffic matching the rule is identified.
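To illustrate how priority-ordered evaluation works, here is a simplified Python model of rule matching. This is a sketch, not Azure’s implementation: port ranges and CIDR matching are reduced to exact values and wildcards, and the rule names and addresses are hypothetical.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Rule:
    name: str
    priority: int     # 100-4096 for user rules; lower value wins
    direction: str    # "Inbound" or "Outbound"
    protocol: str     # "TCP", "UDP", "ICMP" or "*"
    source: str       # simplified: exact address or "*"
    dest_port: int    # simplified: a single port, not a range
    action: str       # "Allow" or "Deny"

def evaluate(rules: List[Rule], direction: str, protocol: str,
             source: str, dest_port: int) -> str:
    """The first matching rule in priority order decides. Unmatched
    traffic is denied, mirroring the DenyAll* defaults Azure appends
    at priority 65500."""
    for rule in sorted(rules, key=lambda r: r.priority):
        if (rule.direction == direction
                and rule.protocol in ("*", protocol)
                and rule.source in ("*", source)
                and rule.dest_port == dest_port):
            return rule.action
    return "Deny"

rules = [
    Rule("AllowHttpsIn", 100, "Inbound", "TCP", "*", 443, "Allow"),
    Rule("DenyDbFromWeb", 200, "Inbound", "TCP", "10.0.1.4", 1433, "Deny"),
]
print(evaluate(rules, "Inbound", "TCP", "203.0.113.7", 443))  # Allow
print(evaluate(rules, "Inbound", "TCP", "203.0.113.7", 22))   # Deny (no match)
```

Note how the fall-through return implements the "deny by default" best practice discussed below.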

Whenever traffic is permitted, a record is created to keep track of the network traffic, and these records can be used by network traffic analytics tools for further threat inspection and analysis. A best practice for network security rules is to start by denying all traffic, and then creating rules only for traffic which is known to be safe. Microsoft Azure automatically creates a few default rules in each Network Security Group, including:

Default inbound rules:

Name                           Priority  Source             Source ports  Destination     Destination ports  Protocol  Access
AllowVNetInBound               65000     VirtualNetwork     0-65535       VirtualNetwork  0-65535            All       Allow
AllowAzureLoadBalancerInBound  65001     AzureLoadBalancer  0-65535       0.0.0.0/0       0-65535            All       Allow
DenyAllInbound                 65500     0.0.0.0/0          0-65535       0.0.0.0/0       0-65535            All       Deny

Default outbound rules:

Name                           Priority  Source             Source ports  Destination     Destination ports  Protocol  Access
AllowVnetOutBound              65000     VirtualNetwork     0-65535       VirtualNetwork  0-65535            All       Allow
AllowInternetOutBound          65001     0.0.0.0/0          0-65535       Internet        0-65535            All       Allow
DenyAllOutBound                65500     0.0.0.0/0          0-65535       0.0.0.0/0       0-65535            All       Deny

The following screenshot shows a network security group for a database resource, and you can see that all inbound and outbound traffic is explicitly denied, with the exception of a few rules.


Figure 2 – A production Network Security Group with its rules configured

Service Tags & Application Security Groups

Once you begin using NSGs, you will likely find that managing IP addresses at scale becomes challenging, requiring the creation and management of many rules. To simplify this, Microsoft Azure introduced the concept of a “service tag”, a pre-defined collection of IP addresses associated with a specific resource. Currently there are service tags available for Azure virtual networks, load balancers, cloud, traffic manager, storage, SQL, Cosmos DB, key vault, event hub, service bus, container registry, app service, app service management, API management, connectors, gateway manager, data lake, Active Directory, monitor, service fabric and machine learning resources. A current list from Microsoft is available here.

Microsoft Azure also allows the security groups to be managed at the application-level, further simplifying management by abstracting the IP address(es) from an application. This means that an Azure application may be used in a rule as a source or destination. It is a best practice to use either service tags or application security groups to simplify management.

Other Network Security Group Tips

When planning your NSGs, you only have to worry about the IP addresses which are assigned to your business, which means that you can ignore anything assigned to an Azure infrastructure service, such as DNS, DHCP, etc. Similarly, if you are using load balancers, you only need to worry about the source and destination of the computer or service, not the IP addresses used by the load balancer itself. You must also make sure that any VM within the security group has a valid license for its guest operating system. Finally, be careful about denying all outbound internet traffic for VMs which use extensions, as these extensions may become blocked, causing the VMs to appear frozen in an ‘updating’ state.

Wrap-Up

In conclusion, you should love Azure network security groups for their ability to let you quickly and easily manage network security. While configuration may be tedious at first, by taking advantage of service tags and application security groups, you can streamline the process. Make sure that NSG planning and management is considered with your standard Azure operating procedures moving forward to help secure and protect your Microsoft cloud infrastructure.

What about you? Have you tried using NSGs in Azure yet? Do you have any questions or concerns? Let us know in the comments form below!

Thanks for reading!

Go to Original Article
Author: Symon Perriman

Village & Pillage out today on Bedrock

Download it for Xbox One, Windows 10 Edition, iOS, Android and Nintendo Switch today!

Did it take a village to finish Minecraft’s biggest update yet? Oh pah-lease. It took so much more! It took hardworking developers, pixel artists, that wonderful person who keeps the office fridge stocked, far too many employees to thank here, and most of all – you! – the players, giving us your constant feedback every step of the way. You helped make Village & Pillage an update we couldn’t be more proud of.

Best of all, we get to release it today! Xbox One, Windows 10 Edition, iOS, Android and Nintendo Switch players will find the update ready to be installed either right now, very soon, or in the recent past – in fact, why not check your chosen Minecraft platform now and see if you’re ready to visit all-new villages, fight all-new mobs, and discover tons more all-new, all-great features!

(Playing Minecraft: Java? You get Village & Pillage today too! Click this lush line of emerald green text to find out more).

You’ll find a full changelog below, but while you’re still up here, I’ll highlight some of my favourite new features:

  • Pillagers! What can I say? I love a good scrap, and so does the newest mob to enter Minecraft, even if they’re not too bright. Quick, craft a brand-new crossbow before they attack you with theirs!
  • The Wandering Trader! A mysterious merchant who travels the lands in search of sales. Sure, I could waste more of my life telling you all about this sales savant, but why bother when I can just link you to this ace video all about the Wandering Trader instead? Or this brilliant article written by Per ‘you better mention I also worked on that video’ Landin?
  • New villager trades! You’ll find villagers busy beavering away at their new jobs in each village you visit, and we’ll have a deep dive into all their new occupations this Saturday!

Keep checking Minecraft on Xbox One, Windows 10 Edition, iOS, Android and Nintendo Switch for the update, and enjoy!

Want to chat with us about the update? Join us on the official Minecraft Discord at discord.gg/Minecraft

Known Issue: some mobs are bugged and might look like new villagers in some Marketplace maps. We’re working on a hotfix and we want to get it out to you as soon as we can – please don’t open these worlds until then. See aka.ms/brokenworlds for more information.

Please note that there is a bug that currently affects HD texture packs on mobile devices. Using these texture packs may cause your game to crash. Until we get this bug fixed, we advise you to not use HD texture packs on mobile devices. We hope to get this bug fixed soon!

FULL CHANGELOG

Village

  • Updated Villages

    • Many new building types and enhanced village generation

    • Biome specific architecture for plains, desert, savannah, and taiga

  • New Villagers

    • Villagers have new clothing to indicate their level, profession, and biome

    • Added Mason and Nitwit villagers

    • Villagers now sleep in beds

    • Villagers now visit their job sites during the day and go home at night

    • Greatly improved villager pathfinding

    • Villagers in existing worlds will convert to new villagers (if they are not part of a template world)

    • Zombie Villagers now have biome-specific and profession skin layers

  • Village Job Sites

    • Villagers can now take on a new profession when near a job site block

    • While villagers claim these sites, they also have functions for players

    • Cartography Table – Provides an easier way to copy and enlarge maps. Maps can be locked by using glass panes

    • Grindstone – Used to repair weapons and tools, plus disenchanting

    • Barrel – Stores items like a chest but can still be opened with blocks on top of it

    • Smoker – Cooks food much faster than a furnace

    • Blast Furnace – Faster ore smelting

    • Composter – Adding enough crops will produce bone meal

    • Stonecutter – Easy crafting for stone and cobblestone items

    • Smithing Table and Fletching Table – Functionality coming in a later update

  • Villager Trading

    • Added brand new villager trades (169044)

    • Villagers now have a visual based trading system and will hold up an item they wish to trade if the player is holding something they want

    • When villagers make trades, they gain experience. When they gain enough experience, they level up. Leveling up unlocks new trades

    • Villagers will resupply their trades when arriving at their job site (172559)

  • Wandering Trader

    • A villager mob that periodically appears at a village’s gathering site and stays for 2-3 game days

    • This trader offers items from a wide variety of different biomes, random dyes, and other rare materials

    • Accompanied on their journey by two fancy llamas!

  • Bells

    • When rung, all villagers will run into their houses

    • Bells ring when players interact with them or are powered by redstone

Pillage

  • Pillager Outposts

    • The new tower hangout for pillagers, which generates in the same biomes as villages

    • Pillagers will respawn around the tower

    • Clear them out and score some loot!

  • Illager Captain

  • Raids

    • When a player enters a village with Bad Omen, a raid will be triggered

    • Pillager enemies will attack a village in waves

    • Players that successfully defend a village from a raid will receive the Hero of the Village effect, giving a steep discount on trades with villagers

  • Ravager

    • A powerful, new enemy mob found in illager patrols and during village raids

    • When running, it can destroy some blocks like crops so watch out!

    • Can be ridden into battle by illagers

  • Pillager Patrols

New Features

  • Campfire

    • A new light source to cozy up your village

    • Works great as a fireplace in a home, with no fire spread to worry about

    • Throw some food on it and become a campfire cooking pro!

  • Sweet Berries

  • Bamboo Jungle

  • New Achievements

    • Plethora of Cats – Befriend twenty stray cats (20G)

    • Kill the Beast! – Defeat a Ravager (30G)

    • Buy Low, Sell High – Trade for the best possible price (50G)

    • Disenchanted – Use a Grindstone to get experience from an enchanted item (20G)

    • We’re being attacked – Trigger a Pillager Raid (20G)

    • Sound the Alarm! – Ring the bell with a hostile enemy in the village (20G)

    • I’ve got a bad feeling about this – Kill a Pillager Captain (20G)

  • Roaming Skin Choice

    • When choosing a skin from a skin pack, the selected skin will now be selected automatically on other Bedrock devices using the same account

    • Some skin packs may not be eligible for roaming selection

  • Accessibility Features

    • Text to Speech can now be enabled to read in-game chat

    • UI Screen Reader can be enabled to say the name of interface controls and their current state

    • Accessibility features can be enabled in Settings

Changes

  • Note on World Generation: In order to deliver the coolest generated villages possible, some world seeds may have villages generate in different areas than they used to before this update

  • A fresh new batch of seeds is now available in the Seed Picker when creating a new world

  • Added even more new textures to blocks and items, including stained glass

  • Increased the amount of scaffolding that can be placed out from its initial support

  • Changes to the way cats spawn in villages:

    • Cats now respawn based on number of beds in the village

    • The number of cats = 1/4 the number of beds

    • Cat total caps at 10 cats per village

  • Lecterns now emit a redstone signal when turning pages

  • Darkened portions of the game’s menus to provide stronger contrast for accessibility
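The cat-spawning change above boils down to one line of arithmetic. As a quick illustrative sketch (the helper name is ours, and we're assuming the quarter rounds down, which the changelog doesn't specify):

```shell
# Hypothetical helper illustrating the cat rule above: 1/4 of the village's
# beds, capped at 10 cats. Assumes the fraction rounds down (not confirmed).
village_cat_cap() {
  cats=$(( $1 / 4 ))
  if [ "$cats" -gt 10 ]; then cats=10; fi
  echo "$cats"
}

village_cat_cap 4    # prints 1
village_cat_cap 60   # prints 10 (capped)
```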

Fixes

Author: Microsoft News Center

Pointers on becoming a transformational CIO, pitfalls to avoid

What does it mean to be a transformational CIO?

The question was raised by industry analyst Michael Krigsman at a CIO event last month at Boston College. There to answer was a panel steeped in the world of CIOs: two former CIOs turned consultants, a research scientist whose field is business transformation and a high-profile CIO with a history of taking on tough IT assignments.

One widely shared opinion: The role is not defined by technology or circumscribed by digital transformation — at least not as that term is typically defined.

“Most of the time, we’re talking about digital transformation in a vacuum. It’s digital for digital’s sake, but it isn’t tied to a specific outcome that brings business benefit,” said consultant Tim Crawford, CIO strategic adviser at AVOA in Rolling Hills Estates, Calif., and host of the “CIO In the Know” podcast. Business benefit is the North Star of the digital transformation, he said — “not tech for tech’s sake.”

Here are some of the panel’s key takeaways on what goes into being a transformational CIO.

Navigating organizational change

Stephanie Woerner, research scientist, MIT Sloan Center for Information Systems Research

Stephanie Woerner, a research scientist at the MIT Sloan Center for Information Systems Research, agreed the technology overhaul usually required for digital transformation doesn’t happen in a vacuum.

“Digital transformation is almost a synonym for organizational change,” she said.

The organizational change happens in three realms: at the systems level, in workplace politics and in corporate culture. 

Companies are implementing new technology systems today to help them compete more effectively in a digital economy. The technology alters products and services and how they are delivered — and, consequently, all the decision rights formerly associated with the old business model.

“It’s a lot of political change” that IT and business leaders often fail to pay mind to, she said. The new systems and processes drive cultural change, because they require people to work differently and interact with each other differently.

“The fact that you’re changing who you’re selling to or what you’re selling is going to impact all three of those categories,” Woerner said. 

Transformational CIO as persuader, imaginator-in-chief

Isaac Sacolick, president, StarCIO

Isaac Sacolick, whose career as a CIO included stints at Businessweek and McGraw-Hill, said the essence of being a transformational CIO is getting people to do things differently — and “that is hard to do.”

“In all my roles as CIO, I was at companies under duress — they were really being disrupted,” said Sacolick, now the president of New York-based consultancy StarCIO and author of a book on digital transformation. “As leaders, you need to persuade people to think differently.”

Early adopters are easy to manage in situations where change is urgently needed, he said, as they are, by definition, eager to try new things. As for the laggards, he said companies have ways to deal with this cohort, including firing people.

“Then, there is a middle group, some 70% of the organization, that needs to be convinced,” he said — and not just about what technologies to invest in and how much to invest. “You need people to think about data, about customer experience, about where the organization wants to win.”

Crawford said the transformational CIO is not aiming for the status quo.

“We get into this mode of looking for best practices. But that is somebody else’s best practice, not necessarily yours,” he said. “The way your organization needs to evolve requires imagination.”

Focus on customer experience

Among transformational CIOs, change management skills and imagination are not confined to the IT department.

MIT Sloan’s Woerner said the CIOs her research group is talking to these days “have a vision of what their company can do.” They are CIOs at large companies with big IT infrastructures and big IT projects, she said, but they are not just “services” CIOs.

To be sure, they measure people to ensure IT service is where it needs to be. “But that is not where their big value is. They are vital to the business,” Woerner said.

As such, they are knowledgeable about what customers want and what problems they are trying to solve — leaning, if necessary, on partners in the business to educate them on these issues.

Sacolick said he is continually surprised at how many CIOs “are not thinking about the customer and customer experience.

“If you’re not calling up the CMO [chief marketing officer] and asking questions, you’re in trouble,” he said, underscoring that CIOs can’t just be focused on operational efficiency, because they “will run out of runway” on that metric and jeopardize advancing their careers.

Litmus test for CIO as change maker

Panel member Ben Haines, CIO at Verizon Media, whose past experience includes jobs at Yahoo, Box and Pabst Brewing Co., has made a practice of taking on tough IT assignments.

“I guess I’m a sucker for punishment, but there is reward in taking an organization and getting it out of a bind,” he said. Calling himself “a builder and not an operator,” he said he hands over the IT operations function to someone else once the rebuilding is done.

Tim Crawford, CIO strategic adviser, AVOA

Crawford said a “quick and dirty way” for CIOs to assess their potential as change-makers is to ask themselves if they know the top three topics discussed by their boards of directors.

“If the answer is ‘no,’ you need to stop and step out of your shoes and take a look at where you are and how you are aligning yourself with the rest of the company,” he said.

Crawford said the same litmus test can be applied to the rest of the C-suite: Can CIOs name the top three issues of their CEOs, CFOs, CMOs and so on?

“If you are not familiar with the kinds of conversations they are having, that is usually a good indicator you are not as plugged in as you need to be,” he said.


How will Microsoft role-based certifications affect admins?

The cloud platform product family comprises the first wave of Microsoft’s role-based certifications changes. Before you start to stress that you wasted time and money on certifications, it’s important to understand what the evolving Microsoft certification path means for you and the future of Windows Server and Exchange certifications.

If you’re a Windows Server or Exchange Server administrator interested in certifying your expertise, you probably planned to pursue one or both of Microsoft’s certification credentials. For Windows Server specialists, the company offers the Microsoft Certified Solutions Expert (MCSE): Core Infrastructure certification. For Exchange specialists, Microsoft offers the MCSE: Productivity certification.

But Microsoft’s shift toward role-based certifications shouldn’t affect your current MCSE study path. And although Microsoft will retire MCSE certifications in favor of new role-based ones, your transcript will still reflect your expertise in expired exam topics. It’s worthwhile to continue studying for MCSE titles even with changes on the horizon.

No sign of change for Windows Server and Exchange certifications

Microsoft has not announced any plans for role-based certifications for Windows Server and Exchange yet, nor has the company developed a timeline. As of spring 2019, if you are completing or interested in MCSE certifications, you should stay the course.

Microsoft has not yet slated those MCSE titles for retirement. Enough active exams exist within the certification requirements to offer flexibility for how you certify as an MCSE. Your MCSE will remain on your transcript, regardless of whether it’s listed under active or history, and you can still claim you are an MCSE in Exchange, Windows Server or both.

Transcripts still reflect expired exam certifications

Microsoft role-based certifications represent the way forward for all future Microsoft certifications.

Even though Microsoft has set a new direction for certifications, the ones you have now can still prove to employers that you have specific skills, even after those certifications are no longer current. Individual Microsoft certification exams expire, but Microsoft often offers other elective exams to earn your MCSE.

For example, the Core Infrastructure MCSE program requires you to pass at least one exam in addition to earning your MCSA: Windows Server 2012 and MCSA: Windows Server 2016, but Microsoft has multiple exams to choose from.

The Microsoft Certification Official Transcript keeps track of your exams and certifications in several sections. As of spring 2019, the sections on a transcript include active certifications, exams and certification history.

Figure 1 shows your active certifications. You can see MCSEs in Productivity and Server Infrastructure, now renamed Core Infrastructure, along with the achievement dates. Under the exams section, you can see every Microsoft exam you’ve passed, as well as the achievement date. Note that the exams do not show active or retired status.

Active certifications on a transcript
Figure 1. The Microsoft Certification Official Transcript shows which certifications are active.

The Certification History lists certifications that are inactive by virtue of their exams being retired. For example, Figure 2 shows that Microsoft retired the old Microsoft Certified IT Professional certification for Windows Server 2008.

Inactive certifications
Figure 2. The Certification History shows your inactive certifications.

The new Microsoft role-based certifications have a formal two-year validity period. Microsoft Learning has not yet published the recertification requirements. To keep track of the requirements, check on the Microsoft Learning website for retiring exams and certifications, and check the community blog. If Microsoft Learning publishes role-based badges for Windows Server and Exchange, it will announce the additions on the community blog first.

Why is Microsoft changing its certifications?

Until late 2018, the MCSE in Cloud Platform and Infrastructure validated Microsoft Azure expertise. To earn it, you had to pass three now-retired exams that covered Azure administration, development and architecture.

It’s unusual to have expertise in any two of the above, much less all three. As a response to this rarity, Microsoft created several entry-, associate- and expert-level certifications that distinguish between job roles starting with the cloud platform products, such as Azure, Office 365 and Dynamics 365.

For example, an Azure administrator could pass the AZ-103 exam to earn an associate-level Microsoft Certified Azure Administrator badge. Microsoft Learning partnered with Credly, a digital badge vendor, to distribute shareable, validated badges.

Microsoft Certified: Azure Solutions Architect Expert badge
Figure 3. The new Microsoft role-based certifications use digital badges for verification and sharing.

In Figure 3, you can see the benefits of the new digital badge delivery method, such as:

  1. The shareable badge page lists when you earned the certification and when it expires.
  2. Credly provides code to make it easy to embed your badge on your website or in your social media profiles.
  3. The people you share your badge with can see what skills the credential validates.
  4. The badge page lists the earning requirements for the badge. Click an individual exam to reach that test’s badge page, which is personalized with your own metadata.

The new Microsoft certification path captures the scope of particular job roles and uses a digital badge to make sharing and verification simple. Microsoft role-based certifications represent the way forward for all future Microsoft certifications.


For Sale – Skull Canyon NUC + Other NUCs

——————————————-
= 1 =
This is up for sale; I bought it from here:

For Sale – Intel NUC8i7HVK NUC w/500Gb Evo 970 and 16Gb DDR

I bought 32GB of RAM as it was needed for the project, so you have the option of 16GB or 32GB of RAM.

£715 with 16GB RAM SOLD

Both prices include postage
Will post pictures later this evening.

——————————————-
= 2 =
I have an i5-based NUC in a fanless Tranquil case.
The NUC is a D54250WYK and specs can be found here:

Intel® NUC Kit D54250WYK Product Specifications

Tranquil PC H22 Fanless NUC Chassis

And here is the info about the other parts..

  • I5 Intel Haswell NUC – Intel® Core™ i5-4250U Processor (3M Cache, up to 2.60 GHz)
  • Tranquil PC H22 Fanless NUC Chassis (supports additional 2.5″ HDD)
  • 8GB G.Skill Ripjaws DDR3 2133MHz SO-DIMM Low-voltage 1.35V laptop memory kit (1x 8GB) CL11
  • Samsung 840 EVO 120GB mSATA Solid State Drive
  • Intel Dual Band Wireless-AC 7260 Wi-Fi and Bluetooth card
  • Windows 10 Home License (ESD)


History: I bought this off someone here and it originally had 16GB of RAM, but on arrival it didn’t recognise the full 16GB.
I removed the memory modules one by one and found that one of the slots wasn’t being recognised, so I removed that stick and have been using it with 8GB ever since.

I have mainly been using this occasionally to run Windows 10 for my indoor gym bike, as part of a virtual indoor bike trainer (connected to a large TV).

About 3 months ago I swapped this for an Apple TV (it was a pain to use with a remote keyboard etc.) and since then have been using it to run VMware (it’s currently running 6.7 off a USB stick).

I now have some i7 NUCs and this isn’t needed anymore. Obviously this is now limited to 8GB of RAM which may or may not be an issue. The unit has never missed a beat.

With the fanless case it is not massively heavy, but will not be super cheap to post.

With this in mind I’d be looking for £100 for the unit including postage.

No box, just the NUC, PSU and HDMI cable.

It uses a mini HDMI port, and I can throw in a cable.


— sold as discussed in thread —
——————————————-
= 3 =

INTEL NUC NUC5CPYH

Spec:

INTEL NUC NUC5CPYH
– Celeron N3050 @ 1.6GHz
– 8GB DDR3L
– 80GB internal 7200rpm 2.5″ SATA drive
– VGA (HDB15) & HDMI 1.4b
– 4 USB3 & SDXC
– Windows 10 Home License (ESD)

Full specs at:

Intel® NUC Kit NUC5CPYH

In good condition and all works as expected. It has a small internal fan, but the CPU is only 6W, so it’s very quiet (almost silent). Sorry, no box, just the NUC and PSU.

Looking for around £95 including delivery.

— moved to eBay —
——————————————-
= 4 =
Intel i7 NUC Skull Canyon
32gb RAM
250GB V-Nand 970 EVO SSD

Warranty until September 2020 via
Warranty Information
Note that I have had a shuffle around: I had three of these and now only require one, so two are for sale. The previously mentioned one with damaged screws I am keeping, as it doesn’t bother me.

This one has no issues at all and comes with Windows 10 Pro installed (licensed via ESD). If you need to reload (using Microsoft’s media creation tool and a USB stick) it will re-licence when it connects to the internet.

Sold as just the NUC and Power Supply, sorry no box but will be very well packed.

For a complete technical specification please see:

Intel® NUC Kit NUC6i7KYK Product Specifications

£525 including P&P

——————————————-

——————————————-

Price and currency: Various
Delivery: Delivery cost is included within my country
Payment method: PPG or BT
Location: BRISTOL
Advertised elsewhere?: Not advertised elsewhere
Prefer goods collected?: I have no preference

______________________________________________________
This message is automatically inserted in all classifieds forum threads.
By replying to this thread you agree to abide by the trading rules detailed here.
Please be advised, all buyers and sellers should satisfy themselves that the other party is genuine by providing the following via private conversation to each other after negotiations are complete and prior to dispatching goods and making payment:

  • Landline telephone number. Make a call to check out the area code and number are correct, too
  • Name and address including postcode
  • Valid e-mail address

DO NOT proceed with a deal until you are completely satisfied with all details being correct. It’s in your best interest to check out these details yourself.


Microsoft acquisition of Express Logic adds IoT improvements

Microsoft’s acquisition of Express Logic will give the tech giant access to billions of connected devices and, ultimately, give IT better control over IoT endpoints.

With the acquisition, Microsoft plans to make Express Logic’s real-time operating system, ThreadX, available on devices with Azure Sphere, Microsoft’s cloud-based IoT software. The integration will also enable ThreadX-powered devices to connect to Azure IoT Edge, Microsoft’s edge computing platform.

Real-time operating systems, or RTOSes, can be used to process data quickly. As such, they are a crucial part of managing IoT devices, which are often scattered across regions and hard to control. An RTOS can provide more consistency and reliability between an IoT device and its cloud than a general-purpose OS.

Microsoft already had the cloud aspect of IoT covered with Azure and its cloud computing capabilities — both Azure Sphere and IoT Edge were developed in the past year. By adding Express Logic and its RTOS, Microsoft can now control both endpoints of an IoT interaction.

“Microsoft has made a tremendous investment in the Azure platform,” said Ken Klika, partner of cloud and IT solutions at Sikich LLP, a technology consultancy in Chicago. “But IoT starts on the devices and endpoints. This Microsoft acquisition is in line with the consistency of having OSes on the devices. It’s the ability to have better connectivity with ThreadX and the Azure platform. Rather than create their own RTOS, it made sense to buy a major player.”

The acquisition, which was announced on April 18, is part of Microsoft’s 2018 commitment to invest $5 billion in IoT over four years.

“Microsoft historically wanted Windows on everything,” Klika said. “But, now, Microsoft is allowing for a consistent conversation on connectivity and interactivity to work within their environment.”

Express Logic has been around for 23 years and has made a name for itself as an embedded OS company. Its ThreadX RTOS is currently running on more than 6 billion devices, giving Microsoft a robust set of IoT devices to connect to Azure. Financial terms of the deal weren’t disclosed.

Microsoft is allowing for a consistent conversation on connectivity and interactivity to work within their environment.
Ken Klika, partner of cloud and IT solutions, Sikich LLP

One of the differentiators that Express Logic brings to Microsoft is its RTOS has a small footprint of 2 KB of instruction area and 1 KB of RAM. That means it can work on complex IoT devices, such as medical devices or manufacturing equipment, as well as constrained devices, which have limited resources. Constrained devices are typically battery-powered, have less than 64 KB of flash memory, and can include smart lightbulbs and smart thermostats.

“While we recommend Azure Sphere for customers’ most secured connections to the cloud, where Azure Sphere isn’t possible in highly constrained devices, we recommend Express Logic’s ThreadX RTOS over other RTOSes in the industry,” Sam George, head of Azure IoT at Microsoft, wrote in a blog post.

Microsoft, which declined to comment directly on the acquisition, remains tight-lipped about its future with Express Logic. However, Klika said he sees a scenario in which Microsoft offers manufacturers and developers an end-to-end platform for IoT devices.

“Microsoft is looking at this as an opportunity to expand the platform and let creative developers come up with applications,” Klika said. “If you think about Microsoft’s growth into open source and supporting the developer community, they’ve pivoted 180 degrees from where they used to be.”

Additionally, Klika said he sees developments like the Microsoft acquisition of Express Logic spurring the adoption of IoT for organizations.

“Organizations are still trying to be creative about how to leverage the concept of IoT and go beyond basic data collection,” Klika said. “If [Microsoft] can package solutions at a lower cost and develop a service around this, it can be exciting.”
