Tag Archives: Part

For Sale – (Nearly New) Latest Apple MacBook Air 13.3″ Laptop, 256GB – Model No : A1466 – RRP: £1,099.00

The 2017 13.3″ MacBook Air (Model No: A1466, Part No: MQD42B/A), the latest model, was bought new in May this year as an intended gift. The battery cycle count is 17: I used it briefly to see whether I would prefer it over my MacBook Pro and to copy across some applications and music, but it has since been factory reset and has sat unused for a couple of months, as I use my MacBook Pro and iMac much more.

The MacBook Air is 100% mark/scratch free with zero dead pixels, and the box and accessories are also in perfect condition. Full Apple warranty until May 2019.

The 13″ MacBook Air has the following specs and is currently the highest-spec model on sale in the Apple Store (without customisation):

  • 1.8GHz dual-core Intel Core i5 processor with Turbo Boost up to 2.9GHz
  • 8GB 1600MHz LPDDR3 memory
  • 256GB SSD storage
  • Intel HD Graphics 6000

£850 collected or £850 plus delivery if required.

No offers! RRP: £1,099 from the Apple Store:

13-inch MacBook Air

Price and currency: £850 + Delivery
Delivery: Delivery cost is not included
Payment method: BT/PPG
Location: South Cumbria
Advertised elsewhere?: Advertised elsewhere
Prefer goods collected?: I have no preference

______________________________________________________
This message is automatically inserted in all classifieds forum threads.
By replying to this thread you agree to abide by the trading rules detailed here.
Please be advised, all buyers and sellers should satisfy themselves that the other party is genuine by providing the following via private conversation to each other after negotiations are complete and prior to dispatching goods and making payment:

  • Landline telephone number. Make a call to check out the area code and number are correct, too
  • Name and address including postcode
  • Valid e-mail address

DO NOT proceed with a deal until you are completely satisfied with all details being correct. It’s in your best interest to check out these details yourself.

Oracle Autonomous Database Cloud gets transaction processing

Oracle is now offering transaction processing capabilities as part of its Autonomous Database Cloud software platform, which is designed to automate database administration tasks for Oracle users in the cloud.

The vendor launched a new Oracle Autonomous Transaction Processing (ATP) cloud service, expanding on the data warehouse service that debuted in March as the first Autonomous Database Cloud offering. The addition of Oracle ATP enables the automated system to handle both transaction and analytical processing workloads, Oracle executive chairman and CTO Larry Ellison said during a launch event that was streamed live.

Ellison reiterated Autonomous Database Cloud’s primary selling point: that automated administration functions driven partly by machine learning algorithms eliminate the need for hands-on configuration, tuning and patching work by database administrators (DBAs).

That frees up DBAs to focus on more productive data management tasks and could lead to lower labor costs for customers, he claimed.

“There’s nothing to learn, and there’s nothing to do,” said Ellison, who also repeated previous jabs at cloud platforms market leader Amazon Web Services (AWS) and previewed the upcoming 19c release of the flagship Oracle Database software that underlies Autonomous Database Cloud.

Cloud success still a test for Oracle

However, while Ellison taunted Amazon for its longtime reliance on Oracle databases and expressed skepticism about his competitor’s ability to execute a reported plan to completely move off of them by 2020, Oracle lags behind not only AWS but also Microsoft and Google in the ranks of cloud platform vendors.

“Make no mistake, Oracle still has to prove themselves in the cloud,” Gartner database analyst Adam Ronthal said in an email after the announcement.

And Oracle isn’t starting from a position of strength. Overall, the technology lineup that Oracle currently offers on its namesake cloud doesn’t match the breadth of what users can get on AWS, Microsoft Azure and the Google Cloud Platform, Ronthal said.

But Oracle ATP “helps close that gap, at least in the data management space,” he said.

Together, ATP and the Autonomous Data Warehouse (ADW) service that preceded it “are Oracle coming out to the world with products that are built and architected for cloud,” with promises of scalability, elasticity and a low operational footprint for users, Ronthal said.

Larry Ellison, Oracle’s executive chairman and CTO, introduces the Autonomous Transaction Processing cloud database service.

The Autonomous Database Cloud services are only available on the Oracle Cloud, and Oracle also limits other key data management technologies to its own cloud platform; for example, it doesn’t offer technical support for its Oracle Real Application Clusters software on other clouds.

In addition, Ronthal noted that it’s typically more expensive to run regular Oracle databases on AWS and Azure than on Oracle’s cloud because of software licensing changes Oracle made last year.

“Oracle is doing everything it can to make its cloud the most attractive place to run Oracle databases,” Ronthal said.

But now the company needs to build some momentum by convincing customers to adopt Oracle ATP and ADW, he added — even if that’s likely to primarily involve existing Oracle users migrating to the cloud services, as opposed to new customers.

Oracle’s autonomous services get a look

Clothing retailer Gap Inc. is a case in point, although the San Francisco company’s use of Oracle databases could grow as part of a plan to move more of its data processing operations to the Oracle Cloud.

For example, Gap is working with Oracle on a proof-of-concept project to convert an on-premises Teradata data warehouse to Oracle ADW, said F.S. Nooruddin, the retailer’s chief IT architect.

That’s a first step in the potential consolidation of various data warehouses into the ADW service, he said. Gap also plans to look closely at Oracle ATP for possible transaction processing uses, according to Nooruddin, who took part in a customer panel discussion during the ATP launch event.

Gap already runs Oracle’s retail applications and Hyperion enterprise performance management software in the cloud.

As the retailer’s use of the cloud expands, the Autonomous Database Cloud technologies could help ensure that all of its Oracle database instances, from test and development environments to production systems, are properly patched and secured, Nooruddin said.

Ellison said Oracle ATP also automatically scales the transaction processing infrastructure allotted to users up and down as workloads fluctuate, so they can meet spikes in demand without paying for compute, network and storage resources they don’t need.

That capability appeals to Gap, too, said Connie Santilli, the company’s vice president of enterprise systems and strategy. Gap’s transaction processing and downstream reporting workloads increase sharply during the holiday shopping season — a common occurrence in the retail industry. But Santilli said Gap had to build its on-premises IT architecture to handle the peak performance level, with less flexibility for downsizing systems when the full processing resources aren’t required.

Cloud costs and considerations for Oracle users

In taking aim at AWS, Ellison again said Oracle would guarantee a 50% reduction in infrastructure costs to Amazon users that migrate to Autonomous Database Cloud — a vow he first made at the Oracle OpenWorld 2017 conference.

Meanwhile, Ellison said Oracle customers can use existing on-premises database licenses to make the switch to Oracle ATP and ADW, avoiding the need to pay for the software again. In such cases, users would continue to pay their current annual support fees plus the cost of their cloud infrastructure usage.

The ATP and ADW services layer the automation capabilities Oracle developed on top of Oracle Database 18c, which Oracle released in February as part of a new plan to update the database software annually. During the ATP launch, Ellison disclosed some details about the planned 19c release and the capabilities it will add to Autonomous Database Cloud.

When databases are upgraded to the 19c-based cloud services, the software will automatically check built-in query execution plans and retain the existing ones if they’ll run faster than new ones, Ellison said. That eliminates the need for DBAs to do regression testing on the plans themselves, he added.

Other new features coming with Oracle Database 19c include the ability to configure Oracle ATP and ADW on dedicated Exadata systems in the Oracle Cloud instead of sharing a multitenant pool of the machines, and to deploy the cloud services in on-premises data centers through Oracle’s Cloud@Customer program.

Oracle’s official roadmap shows 19c becoming available in January 2019, but Ellison claimed that was “worst case” and said the new release may be out before the end of this year.

Meltdown and Spectre disclosure suffered “extraordinary miscommunication”

LAS VEGAS — Despite Google’s own Project Zero being part of the discovery team for the Meltdown and Spectre vulnerabilities, Google itself wasn’t notified until 45 days after the initial report was sent to Intel, AMD and ARM.

Speaking at a panel on Meltdown and Spectre disclosure at Black Hat 2018 Wednesday, Matt Linton, senior security engineer and self-described “chaos specialist” at Google’s incident response team, explained how his company surprisingly fell through the cracks when it came time for the chip makers to notify OS vendors about the vulnerabilities.

“The story of Google’s perspective on Meltdown begins with both an act of brilliance and an act of extraordinary miscommunication, which is a real part of how incident response works,” Linton said during the session, titled “Behind the Speculative Curtain: The True Story of Fighting Meltdown and Spectre.”

Even though Project Zero researcher Jann Horn was part of both the Meltdown and Spectre discovery teams, Linton said, Project Zero never notified Google directly. Instead, the Project Zero group followed strict guidelines for responsible vulnerability disclosure and only notified the “owners” of the bugs, namely the chip makers.

“They feel very strongly in PZ [Project Zero] about being consistent about who they notify and rebuffing criticism that Project Zero gives Google early heads up about bugs and things,” Linton said. “I assure [you] they did not.”

Project Zero notified Intel and the other chip makers about the vulnerabilities on June 1, 2017. It had been previously reported that Google’s incident response team wasn’t looped into the Meltdown and Spectre disclosure process until July, but it wasn’t entirely clear why that was. Linton explained what happened.

“[Project Zero] notified Intel and the other CPU vendors of these speculative execution vulnerabilities and they said a third of the way through the email that ‘We found these, here are the proof of concepts, and by the way, we haven’t told anyone else about this including Google, and it’s now your responsibility to tell anyone you need to tell,’ and somewhere along the line they missed that piece of the email,” he told the audience.

Linton said the CPU vendors began the Meltdown and Spectre disclosure process and started notifying companies that needed to know, such as Microsoft, but they apparently believed Google had already been informed because Project Zero was part of the discovery teams. As a result, Google was left out of the early stages of the coordinated disclosure process.

“As an incident responder, I didn’t find out about this until mid-July, 45 days after [the chip vendors] discovered it,” Linton said.

The miscommunication regarding Google was just one of several issues that plagued the massive coordinated disclosure effort for Meltdown and Spectre. The panelists, who included Eric Doerr, general manager of the Microsoft Security Response Center, and Christopher Robinson, principal program manager and team lead of Red Hat Product Security Assurance, discussed the ups and downs of the complex, seven-month process, as well as advice for security researchers and vendors based on their shared experiences.

Editor’s note: Stay tuned for more from this panel on the Meltdown and Spectre disclosure process.

For Sale – Cooler Master Masterliquid ML120L RGB AIO CPU cooler

Brand new and sealed Cooler Master ML120L RGB AIO CPU cooler. Came as part of a bundle but not required in my build as I’ve already got a CPU cooler.

Happy to help with any warranty claims that arise; the warranty is for 2 years.

Delivery at cost, or collection welcomed.

Price and currency: £35
Delivery: Delivery cost is not included
Payment method: Paypal F&F or BT
Location: Addlestone, Surrey
Advertised elsewhere?: Not advertised elsewhere
Prefer goods collected?: I prefer the goods to be collected

______________________________________________________
This message is automatically inserted in all classifieds forum threads.
By replying to this thread you agree to abide by the trading rules detailed here.
Please be advised, all buyers and sellers should satisfy themselves that the other party is genuine by providing the following via private conversation to each other after negotiations are complete and prior to dispatching goods and making payment:

  • Landline telephone number. Make a call to check out the area code and number are correct, too
  • Name and address including postcode
  • Valid e-mail address

DO NOT proceed with a deal until you are completely satisfied with all details being correct. It’s in your best interest to check out these details yourself.

The Complete Guide to Azure Virtual Machines: Part 2

This is part 2 of our Azure Virtual Machines Guide, following our previous article, Introduction to Azure Virtual Machines. In part 1 we created a virtual network for our VM; now we will create a network security group and finally deploy our VM.

Creating the Network Security Group (NSG)

A network security group acts as the firewall for our VM and is required to allow any access to it, so it’s important to set one up before deploying the VM. To create one, we simply select Create a resource on the left-hand side of the Azure management portal and type in “Network Security Group”. We will be presented with the proper blade to create one, so click Create:

Now we need to fill in some fields to create our NSG. For this example I’ll name our NSG “LukeLabNSG”. Then we select the subscription we want to use this NSG with, the resource group, and the location of the Azure data center where this NSG will be located. Once everything is filled out, we click Create:
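As an aside, the same NSG can be created from a script rather than the portal. Here is a minimal sketch using the Azure SDK for Python (azure-mgmt-network); the resource group name “LukeLabRG” and the subscription ID placeholder are assumptions for illustration, so substitute your own values:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.network import NetworkManagementClient

# Assumed values for illustration only.
SUBSCRIPTION_ID = "<your-subscription-id>"
RESOURCE_GROUP = "LukeLabRG"   # hypothetical resource group name
LOCATION = "eastus"

network_client = NetworkManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Create (or update) the network security group, mirroring the portal step above.
poller = network_client.network_security_groups.begin_create_or_update(
    RESOURCE_GROUP,
    "LukeLabNSG",
    {"location": LOCATION},
)
nsg = poller.result()
print(f"Created NSG: {nsg.id}")
```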

We wait for the NSG to deploy and once completed, we can view it by clicking on All Services on the left-hand side and selecting Network Security Groups:

We can now see our new NSG, and we can further configure it by clicking on the name:

We need to assign a subnet to associate this NSG with; select Subnets on the left-hand side:

Now click the Associate button so we can find our subnet and the virtual network that we created in part 1. Remember, we created this when we set up the Virtual Network:

We can now see that we have the LukeLabVnet1 virtual network that we created and the LukeLabSubnet assigned to this network security group. Click OK to configure:
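The association can also be scripted: updating the subnet with a reference to the NSG links the two. The sketch below assumes the subnet prefix from part 1 was 10.0.0.0/24; use whatever prefix you actually configured, since the update must restate it:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.network import NetworkManagementClient

SUBSCRIPTION_ID = "<your-subscription-id>"
RESOURCE_GROUP = "LukeLabRG"   # hypothetical resource group name

network_client = NetworkManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)
nsg = network_client.network_security_groups.get(RESOURCE_GROUP, "LukeLabNSG")

# Re-apply the subnet definition with the NSG attached.
poller = network_client.subnets.begin_create_or_update(
    RESOURCE_GROUP,
    "LukeLabVnet1",
    "LukeLabSubnet",
    {
        "address_prefix": "10.0.0.0/24",           # assumed prefix from part 1
        "network_security_group": {"id": nsg.id},  # associate the NSG
    },
)
subnet = poller.result()
print(f"Subnet {subnet.name} is now associated with {nsg.name}")
```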

Select Inbound security rules on the left-hand side. We want to enable RDP access to this VM so that we can connect to it. Note that for the purpose of this demo we are going to allow RDP access via the public internet; however, this is not best practice for a production environment. In production, you would set up a VPN connection and use RDP over the VPN, as it is much more secure. To create our new rule, we select the Add button:

If we wanted to do any advanced configuration allowing specific ports, we could input that information in these fields. However, since we are just enabling RDP, which uses a well-known port, Microsoft has already created a list of commonly used services that we can easily select and enable. To do this, we click the Basic button at the top:

Now we simply select RDP from the Service drop-down list and the proper information will automatically be filled in. Then we put in a description of the rule and select Add. Also, note that Azure gives us the same warning about exposing RDP to the internet:
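As a rough equivalent of the portal’s RDP preset, the inbound rule can also be created through the SDK. The rule name, priority and resource group below are illustrative assumptions; the essential values are TCP port 3389, the Inbound direction and Allow access:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.network import NetworkManagementClient

SUBSCRIPTION_ID = "<your-subscription-id>"
RESOURCE_GROUP = "LukeLabRG"   # hypothetical resource group name

network_client = NetworkManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Allow inbound RDP (TCP 3389). Demo only: in production, reach RDP over a VPN instead.
poller = network_client.security_rules.begin_create_or_update(
    RESOURCE_GROUP,
    "LukeLabNSG",
    "Allow-RDP",                      # assumed rule name
    {
        "protocol": "Tcp",
        "source_address_prefix": "*",
        "source_port_range": "*",
        "destination_address_prefix": "*",
        "destination_port_range": "3389",
        "access": "Allow",
        "direction": "Inbound",
        "priority": 300,              # any unused priority between 100 and 4096
        "description": "Allow RDP for demo purposes",
    },
)
print(f"Created rule: {poller.result().name}")
```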

Now that we’ve set up our NSG, we can finally deploy our VM.

Deploying a Virtual Machine

Now that we have our Virtual Network and Network Security Group created, we are ready to deploy the Virtual Machine. To do this, select the Create a resource button on the left-hand side and type in Windows Server 2016 Datacenter. Select the Windows Server 2016 Datacenter from the list and select Create:

Now we need to fill out the form shown here to configure our Virtual Machine. For the purposes of this demo, I named mine “LukeLabVM01”. You also need to give it a username and password (use a strong password!). We’ll select the resource group and the Azure data center location where this VM will be hosted, “East US” in this case. Clicking OK will then bring us to the next step:

Select the compute size of the VM that you would like to deploy. The estimated pricing is on the right-hand side:

NOTE: The pricing shown here is for compute costs only. If you need a more detailed breakdown, take a look at the Azure Pricing Calculator.

Now we need to fill in the last set of configuration settings. We need to create an availability set; this is very important to understand, because it cannot be changed unless the VM is rebuilt. (I’ll be putting together a future post on working with availability sets, so stay tuned for that!) In this example, we’ve simply created an availability set during the deployment process and named it LukeLabAS1. We then assign the virtual network and subnet that we created in part 1:

Under Network Security Group, click Advanced and select the NSG that we created in the previous steps. Then click OK to verify the settings:

If all of the settings pass the verification process, we are now given the option to deploy the VM. Click Create, then wait for the VM to finish deploying.
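For completeness, here is a hedged sketch of an equivalent deployment with the Azure SDK for Python (azure-mgmt-compute). It assumes a network interface named “LukeLabVM01-nic” already exists in the subnet from part 1 (the portal creates one for you during this step), and the resource group, VM size and credentials shown are placeholders:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

SUBSCRIPTION_ID = "<your-subscription-id>"
RESOURCE_GROUP = "LukeLabRG"   # hypothetical resource group name

compute_client = ComputeManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Assumed: a NIC already created in LukeLabSubnet and attached to the NSG.
nic_id = (
    f"/subscriptions/{SUBSCRIPTION_ID}/resourceGroups/{RESOURCE_GROUP}"
    "/providers/Microsoft.Network/networkInterfaces/LukeLabVM01-nic"
)

poller = compute_client.virtual_machines.begin_create_or_update(
    RESOURCE_GROUP,
    "LukeLabVM01",
    {
        "location": "eastus",
        "hardware_profile": {"vm_size": "Standard_DS1_v2"},  # the compute size you picked
        "storage_profile": {
            "image_reference": {
                "publisher": "MicrosoftWindowsServer",
                "offer": "WindowsServer",
                "sku": "2016-Datacenter",
                "version": "latest",
            }
        },
        "os_profile": {
            "computer_name": "LukeLabVM01",
            "admin_username": "azureadmin",            # placeholder
            "admin_password": "<use-a-strong-password>",
        },
        "network_profile": {"network_interfaces": [{"id": nic_id}]},
        # Optionally reference the availability set created above:
        # "availability_set": {"id": "<availability-set-resource-id>"},
    },
)
vm = poller.result()
print(f"VM provisioning state: {vm.provisioning_state}")
```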

Once the deployment process is finished, we can see the newly created VM under Virtual Machines. Click Start to power on the VM if it is not already running:

Then click on the VM name and select Connect at the top to get connected to the VM:

Azure gives us two options, SSH or RDP. In this demo we will RDP to the VM, so select the RDP tab and click on Download RDP file:

Once the RDP file is downloaded, open it up, select Connect and input the credentials that we created when we configured the VM:

Now we have access to our VM, and I’ve verified that the hostname of the VM is the one we specified in the deployment settings by bringing up a command prompt:

Wrap-Up

The flexibility of the cloud allows us to stand up Virtual Machines very quickly, and it can be a very advantageous solution for applications that need to scale out massively, or in situations where investing in hardware doesn’t make sense given the expected lifespan of the application. However, there is a steep learning curve when it comes to building and managing cloud resources, and understanding each component is critical to running your workloads in the cloud successfully.

What have your experiences with Azure VMs been like so far? Have you found they fit well in your playbook? Have you experienced difficulties? Have questions? Let us know in the comments section below!

Chief data officer role: Searching for consensus

Big data continues to be a force for change. It plays a part in the ongoing drama of corporate innovation — in some measure, giving birth to the chief data officer role. But consensus on that role is far from set.

The 2018 Big Data Executive Survey of decision-makers at more than 50 blue-chip firms found 63.4% of respondents had a chief data officer (CDO). That is a big uptick since survey participants were asked the same question in 2012, when only 12% had a CDO. But this year’s survey, which was undertaken by business management consulting firm NewVantage Partners, disclosed that the background for a successful CDO varies from organization to organization, according to Randy Bean, CEO and founder of NewVantage, based in Boston.

For many, the CDO is likely to be an external change agent. For almost as many, the CDO may be a long-trusted company hand. The best CDO background could be that of a data scientist, line executive or, for that matter, a technology executive, according to Bean.

In a Q&A, Bean delved into the chief data role as he was preparing to lead a session on the topic at the annual MIT Chief Data Officer and Information Quality Symposium in Cambridge, Mass. A takeaway: Whatever it may be called, the chief data officer role is central to many attempts to gain business advantage from key emerging technologies. 

Do we have a consensus on the chief data officer role? What have been the drivers?

Randy Bean: One principal driver in the emergence of the chief data officer role has been the growth of data.

Randy Bean, CEO, NewVantage Partners

For about a decade now, we have been into what has been characterized as the era of big data. Data continues to proliferate. But enterprises typically haven’t been organized around managing data as a business asset.

Additionally, there has been a greater threat posed to traditional incumbent organizations from agile data-driven competitors — the Amazons, the Googles, the Facebooks.

Organizations need to come to terms with how they think about data and, from an organization perspective, to try to come up with an organizational structure and decide who would be a point person for data-related initiatives. That could be the chief data officer.

Another driver for the chief data officer role, you’ve noted, was the financial crisis of 2008.

Bean: Yes, the failures of the financial markets in 2008-2009, to a significant degree, were a data issue. Organizations couldn’t trace the lineage of the various financial products and services they offered. Out of that came an acute level of regulatory pressure to understand data in the context of systemic risk.

Banks were under pressure to identify a single person to regulators to address questions about data’s lineage and quality. As a result, banks took the lead in naming chief data officers. Now, we are into a third or fourth generation in some of these large banks in terms of how they view the mandate of that role.

Isn’t that type of regulatory driver somewhat spurred by the General Data Protection Regulation (GDPR), which recently went into effect? Also, for factors defining the CDO role, NewVantage Partners’ survey highlights concerns organizations have about being surpassed by younger, data-driven upstarts. What is going on there?

Bean: GDPR is just the latest of many previous manifestations of this. There have been the Dodd-Frank regulations, the various Basel reporting requirements and all the additional regulatory requirements that go along with classifying banks as ‘too large to fail.’

That is a defensive driver, as opposed to the offensive and innovation drivers that are behind the chief data officer role. On the offensive side, the chief data officer is about how your organization can be more data-driven, how you can change its culture and innovate. Still, as our recent survey finds, there is a defensive aspect even there. Increasingly, organizations perceive threats coming from all kinds of agile, data-driven competitors.

You have written that big data and AI are on a continuum. That may be worthwhile to emphasize, as so much attention turns to artificial intelligence these days.

Bean: A key point is that big data has really empowered artificial intelligence.

AI has been around for decades. One of the reasons it hasn’t gained traction is that, as a learning mechanism, it requires large volumes of data. In the past, data was only available in subsets, samples or very limited quantities, and the corresponding learning on the part of the AI was slow and constrained.

Now, with the massive proliferation of data and new sources — in addition to transactional information, you also now have sensor data, locational data, pictures, images and so on — that has led to the breakthrough in AI in recent years. Big data provides the data that is needed to train the AI learning algorithms.

So, it is pretty safe to say there is no meaningful artificial intelligence without good data — without an ample supply of big data.

And it seems to some of us, on this continuum, you still need human judgment.

Bean: I am a huge believer in the human element. Data can help provide a foundation for informed decision-making, but ultimately it’s the combination of human experience, human judgment and the data. If you don’t have good data, that can hamper your ability to come to the right conclusion. Just having the data doesn’t lead you to the answer.

One thing I’d say is, just because there are massive amounts of data, it hasn’t made individuals or companies any wiser in and of itself. It’s just one element that can be useful in decision-making, but you definitely need human judgment in that equation, as well.

Have I Been Pwned integration comes to Firefox and 1Password

Have I Been Pwned has been helping users find out if their data was part of a data breach since 2013, and now the service will be integrated into new products from Mozilla and 1Password.

Troy Hunt, the security expert who created and runs the project, announced the new Have I Been Pwned integration and noted the partnership with Firefox will “significantly expand the audience that can be reached.”

“I’m really happy to see Firefox integrating with HIBP in this fashion, not just to get it in front of as many people as possible, but because I have a great deal of respect for their contributions to the technology community,” Hunt wrote in a blog post. “They’ve also been instrumental in helping define the model which HIBP uses to feed them data without Mozilla disclosing the email addresses being searched for.”

This is a key feature of both Mozilla’s new Firefox Monitor and 1Password Watchtower: using Have I Been Pwned integration to allow users to search without disclosing email addresses. Hunt said this privacy feature will work in a similar way to the k-anonymity model used by Have I Been Pwned when searching for passwords.

When searching for passwords, Have I Been Pwned matches on the first five characters of a SHA-1 hash, which returns, on average, 477 results per search range from a data set of 500 million records. This avoids exposing too much information about the password being queried: the results may or may not include it, but an attacker cannot determine which password was queried from the results returned. With email addresses, Hunt searches on the first six characters of the hash against a database of more than 3 billion addresses, but he added that this shouldn’t result in less secure searches.

“This number [of breached passwords] will grow significantly over time; more data breaches means more new email addresses means larger results in the range search. More importantly though, email addresses are far less predictable than passwords; as I mentioned earlier, if I was to spy on searches for Pwned Passwords, the prevalence of passwords in the system beginning with that hash can indicate the likelihood of what was searched by,” Hunt wrote. “But when we’re talking about email addresses, there’s no such indicator, certainly the number of breaches each has been exposed in divulges nothing in terms of which one is likely being searched for.”
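To make the k-anonymity model concrete, here is a small sketch against the publicly documented Pwned Passwords range API, where only the first five characters of the SHA-1 hash ever leave the client. (The email-address search used by Firefox Monitor and 1Password relies on a separate, non-public feed, so this example covers the password case only.)

```python
import hashlib

import requests


def pwned_password_count(password: str) -> int:
    """Return how many times a password appears in Pwned Passwords, via the range API."""
    sha1 = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    prefix, suffix = sha1[:5], sha1[5:]

    # Only the 5-character prefix is sent; the server returns every suffix in that range.
    resp = requests.get(f"https://api.pwnedpasswords.com/range/{prefix}", timeout=10)
    resp.raise_for_status()

    # Each response line is "<35-char hash suffix>:<breach count>".
    for line in resp.text.splitlines():
        candidate, _, count = line.partition(":")
        if candidate == suffix:
            return int(count)
    return 0


if __name__ == "__main__":
    print(pwned_password_count("correct horse battery staple"))
```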

Have I Been Pwned integration

Mozilla has built Have I Been Pwned integration into its Firefox Monitor tool, which will begin as an invitation-only service. Mozilla plans to invite an initial group of 250,000 people to test the feature on the web beginning next week and do a wider release later on.

1Password will include Have I Been Pwned integration in its Watchtower tool as part of the Breach Report feature. The Breach Report will let users know where an account with a user’s email address may have been compromised; show a list of websites where an item saved in 1Password might have been compromised; and show a list of breaches where a 1Password item was found, but the user has already changed the compromised data.

Currently, 1Password Watchtower is only available on the web, but 1Password expects to eventually add the service to all of its apps.

Airtel CIO targets cutting-edge tech

A major part of every digital transformation is exploring how cutting-edge tech can facilitate the journey. Some companies, like Indian telecom giant Bharti Airtel Ltd., are more capable than others of experimenting with new technologies, affording them a wealth of opportunities for innovation.

In this video from the recent MIT Sloan CIO Symposium, Harmeen Mehta, global CIO and head of digital at Airtel, discusses some of the cutting-edge tech she’s employing at her company — everything from advanced mapping techniques and network digitization to voice computing technology and AI-driven customer offerings.

Editor’s note: This transcript has been edited for clarity and length.

What kind of cutting-edge tech are you using to speed up your company’s digital transformation process?

Harmeen Mehta: Lots of pieces. I think one of the biggest challenges that we have is mapping the intricacies and the inner lanes in India and doing far more than what even Google does. For Google, the streets are of prime importance [when it comes to mapping]. For us, the address of every single house and whether it’s a high-rise building or it’s a flat is very important as we bring different services into these homes. So, we’ve been working on finding very innovative ways to take Google’s [mapping] as a base and make it better for us to be able to map India to that level of accuracy of addresses, houses and floor plans.

Another problem that I can think of where a lot of cutting-edge tech is being used is in creating a very customized contextual experience for the consumer so that every consumer has a unique experience on any of our digital properties. The kind of offers that the company brings to them are really tailored and suited to them rather than it being a general, mass offering. There’s a lot of machine learning and artificial intelligence that’s going into that.

Another one is we’re digitizing a large part of our network. In fact, we’re collaborating with SK Telecom, who we think is one of the most innovative telcos out there, in order to do that. We’re using, again, a lot of machine learning and artificial intelligence there as well, as we bring about an entire digitization of our network and are able to optimize the networks and our investments much better.

Then, of course, I’m loving the new stream that we are creating, which is all around exploring voice as a technology. The voice assistants are getting more intelligent. It gives us a very unique opportunity to actually reach out and bring the digital transformation to a lot of Indians who aren’t as literate — to those for whom reading and writing don’t come as naturally as speaking does. It’s opening up a whole lot of new doors. We’re really finding that a very interesting space to work in, and we’re exploring a lot in that arena at the moment.

View All Videos