Tag Archives: make

For Sale – PC2-6400 & PC3-8500 SODIMM memory

Free to a good (or bad) home.

I’ve had to upgrade laptops to make the Win 10 installs on them reasonably usable, and given what the machines came with and what they support, I’ve had to swap some modules out.

Can anyone make use of these? They were working when removed and should be fine, but obviously there’s no warranty on them.

One of 2GB PC2-6400S

One of 2GB PC3-8500S

Two of 1GB PC3-8500S

I’ll even cover UK post but if they help you out please consider donating a few pounds to your favourite charity. Donation is entirely optional, and no I don’t want to know if you do or don’t donate.

Go to Original Article

D2iQ latest to seek Kubernetes management niche

Following the breakup and restructuring of Docker, D2iQ, the relaunched version of Mesosphere, looks to make its own pivot to Kubernetes management while avoiding its competitor’s fate.

D2iQ’s enterprise Kubernetes distribution, Konvoy, was released when the company re-launched in August, and this week it added a multi-cloud container management product, Kommander. This new product is similar to Red Hat OpenShift, which integrates other open source infrastructure automation projects such as the Istio service mesh alongside Kubernetes, and offers governance and security features such as secrets management.

D2iQ also said it will build its own CI/CD tool called Dispatch into Kommander. Few details are available about the underpinnings for Dispatch yet, and whether it, too, will compete with Kubernetes-affiliated CI/CD tools such as Spinnaker and Tekton, but its overall mission, according to D2iQ co-founder and CTO Tobi Knaup, is to support enterprise GitOps workflows.

The leaders of the company, founded in 2013, know what they’re up against in a market crowded with competitors that’s already begun to see vendor attrition — namely, they’re up against Google, IBM-Red Hat and smaller competitors such as Rancher. SearchITOperations caught up with Knaup this week to discuss D2iQ’s plans to stay relevant in the cutthroat Kubernetes management market.

I’m sure you’re aware of the huge news this week about Docker Enterprise being acquired by Mirantis. How does D2iQ avoid that same fate as it also pivots to Kubernetes?

Tobi Knaup: It is a space where size matters to some degree. To really make products enterprise-grade, it takes big investment. There’s a list that the CNCF publishes of all the different Kubernetes distributions, and I think there are close to 100 — a lot of them smaller companies. We’re very confident that we can compete in this space — we’ve got great investors, we raised a lot of money and we still have a lot of the money in the bank that we raised, [as well as] cloud-native talent, which is expensive and in high demand.

We focused on building a business, and an enterprise business, early on… and that continues to be our focus, building a successful enterprise business around the open source core.
Tobi Knaup, CTO and co-founder, D2iQ

What we’ve focused on since the early days of the company is building products that allow enterprises to use cloud-native tools, all the way back to Apache Mesos, Kafka, Cassandra and so on. DC/OS, which we launched in 2015, made Apache Mesos enterprise-grade. We focused on building a business, and an enterprise business, early on. We built relationships with some of the leading companies in the world. That continues to be our focus, building a successful enterprise business around the open source core.

Enterprise support for cloud-native open source projects such as Istio is also the focus for IBM and Red Hat. How do you go up against them?

Knaup: We have unique skills around data workloads. The original idea for Apache Mesos in 2009 was to build an abstraction layer on top of compute instances where you can run multiple different data services together and they share the resources. That’s been in our DNA since four years before we even started [Mesosphere] in 2013. That’s why a lot of our customers come to us for help: they build data-driven products that have real-time data at their core. We have 10 years of experience automating these data services on any infrastructure, whether it’s a public cloud, an edge location, private data centers, even cruise ships for one of our customers.

Rancher has been out there with the multi-cluster management value proposition for a while now. How does Kommander compete with them?

Knaup: One thing that is unique about Kommander, we think, is how closely it ties into some of the workloads you can run on top. Workloads that are built with Kudo show up in the catalog in Kommander, and you can govern how different projects have access to these workloads. Take Kafka as an example: I want to use a certain version that I have vetted in production, so all of my production clusters in Kommander only have access to that stable Kafka version.

But at the same time, I have a lot of development teams that want the latest and greatest Kafka version with all the latest features. You can govern that. You can define which projects have access to which of these workloads. That works really well with workloads that are based on Kudo, but it supports anything you can put in that catalog.

What is the difference between Kudo and the CoreOS Operators that Red Hat offers?

Knaup: One of the main problems we wanted to solve with Kudo is that it’s just way too hard to create an Operator, and way too hard to use one… We actually saw that same problem years ago with Mesos. The equivalent of an Operator on Mesos is called a framework. After we had built a few, we realized that there were a lot of common things we could extract into a toolkit. On DC/OS, that’s called DC/OS Commons, and we used it for years to build up our data services instead of building a controller for each individual workload…

Kudo does the same thing and gives you a DSL that’s YAML-based and feels familiar to people who have written runbooks or built automation for software before. They don’t have to know the details of Kubernetes internals, like what is a controller? What is a control loop? Developers can just express their Operator in this configuration file, and then the Kudo universal controller sits underneath that.

The other side of it is also to make it easier for Operator users… Kudo provides a command line interface that allows you to do things like install, upgrade, back up and restore, and if you know how to install Kafka, you also know how to install Cassandra. That’s all based on feedback we heard from people at KubeCon and from our customers and prospects who said, hey, this Operator concept is really complicated, and I tried it out and failed.

Does Kudo use the open source Operator Framework under the covers, or is it completely separate?

Knaup: It uses the controller runtime, which is a project out of Google.

Would you have to be using Konvoy or Kommander to use Kudo, or could it run on an OpenShift cluster?

Knaup: It runs on any Kubernetes. It installs on the cluster using a basic deployment of a pod with a controller, and you interface with it from kubectl. Or if you don’t want that plugin, you can just use the CRDs directly as well.

Interview edited for brevity and clarity.

Go to Original Article

Data silos and culture lead to data transformation challenges

It’s not as easy as it should be for many users to make full use of data for analytics and business intelligence, due to a number of data transformation challenges.

Data challenges arise not only in the form of data transformation problems, but also with broader strategic concerns about how data is collected and used.

Culture and data strategy within organizations are key causal factors of data transformation challenges, said Gartner analyst Mike Rollings.

“Making data available in various forms and to the right people at the right time has always been a challenge,” Rollings said. “The bigger barrier to making data available is culture.”

The path to overcoming data challenges is to create a culture of data and fully embrace the idea of being a data-driven enterprise, according to Rollings.

Rollings has been busy recently talking about the challenges of data analytics, including taking part in a session at the Gartner IT Symposium/Xpo, held Oct. 20-24 in Orlando, where he also detailed some of the findings from the Gartner CDO (Chief Data Officer) survey.

Among the key points in the study is that most organizations have not included data and analytics as part of documented corporate strategies.

Making data available in various forms and to the right people at the right time has always been a challenge.
Mike Rollings, analyst, Gartner

“The primary challenge is that data and data insights are not a central part of business strategy,” Rollings said.

Often, data and data analytics are actually just byproducts of other activities, rather than being the core focus of a formal data-driven architecture, he said. In Rollings’ view, data and analytics should be considered assets that can be measured, managed and monetized.

“When we talk about measuring and monetizing, we’re really saying, do you have an intentional process to even understand what you have,” he said. “And do you have an intentional process to start to evaluate the opportunities that may exist with data, or with analysis that could fundamentally change the business model, customer experience and the way decisions are made.”

Data transformation challenges

The struggle to make the data useful is a key challenge, said Hoshang Chenoy, senior manager of marketing analytics at San Francisco-based LiveRamp, an identity resolution software vendor.

Among other data transformation challenges is that many organizations still have siloed deployments, where data is collected and remains in isolated segments.

“In addition to having siloed data within an organization, I think the biggest challenge for enterprises to make their data ready for analytics are the attempts at pulling in data that has previously never been accessed, whether it’s because the data exists in too many different formats or for privacy and security reasons,” Chenoy said. “It can be a daunting task to start on a data management project but with the right tech, team and tools in place, enterprises should get started sooner rather than later.”

How to address the challenges

The early promise of data warehouse and data lake technologies was to make it easier to use data.

But despite technology advances, there’s still a long way to go to solving data transformation challenges, said Ed Thompson, CTO of Matillion, a London-based data integration vendor that recently commissioned a survey on data integration problems.

The survey of 200 IT professionals found that 90% of organizations see making data available for insights as a barrier. The study also found a rapid rate of data growth of up to 100% a month at some organizations.

When an executive team starts to get good quality data, what typically comes back is a lot of questions that require more data. The continuous need to ask and answer questions is the cycle that is driving data demand.

“The more data that organizations have, the more insight that they can gain from it, the more they want, and the more they need,” Thompson said.

Go to Original Article

New to Microsoft 365 in September—updates to Microsoft To Do, PowerPoint, OneNote, and more

Every update we make to Microsoft 365 is about helping our customers transform the way they work. And this month, we’re introducing updates and features designed to help you collaborate more effectively, work more efficiently, and protect your data more proactively.

We updated the conversation experience for Yammer mobile, so it’s easier to discover and share the content that matters most, and a new PowerPoint for iPad feature makes it (finally!) easy to share a single slide. The updated Microsoft To Do app keeps you productive on the go, and now you can export Visio workflows to Microsoft Flow—a great way to streamline business processes and save your organization money and time. Also new in September: automated incident response with Office 365 Advanced Threat Protection (ATP) for more effective, efficient security.

What’s new in Microsoft 365

This September, learn how to build your business with Microsoft 365.

Watch the video

Collaborate, save time, and improve productivity with app updates

App updates streamline your mobile experience and help you conquer your to-do list across devices.

Join the conversation with beautiful new experiences for the Yammer mobile app—We updated the conversation experience for Yammer mobile, so you can connect, discover, and share in a way that’s easy on the thumbs and on the eyes. Highlights include a new card-based design that sharpens content, easy-to-read formatting and styling, a new grid layout for docs and images that makes it easier to preview and engage with multiple images or files, and link previews and inline videos right in the feed. To experience these improvements, simply update your Yammer app to the latest version.

Image of three phones side by side displaying a Group Conversation in Yammer.

Get more done with the new version of Microsoft To Do—This month, we unveiled a new version of To Do with a fresh design, access from anywhere, and better integration with Microsoft apps and services. You can choose from customization options and backgrounds (including dark mode) to tailor your experience and suit your lists. You can also sync To Do across all platforms to take your lists with you wherever you go. In addition, you can integrate To Do with Microsoft 365 apps and services to get one centralized view of your tasks. Think of To Do as your new task aggregator—from Outlook to Planner to Cortana and the Microsoft Launcher on Android, you can now see your whole list in one place.

To get started, download the To Do app. And if you’re coming from Wunderlist, you can import your existing lists in just a few clicks.

Animated image of Microsoft To Do displaying a My Day to-do list.

Share, communicate, and collect feedback more efficiently and effectively

New app features let you share individual PowerPoint slides, stay on top of your notes in OneNote, and conduct more effective surveys.

Share individual slides and overcome public speaking jitters with new PowerPoint features—Sometimes you only need to share that one specific slide, and sending it via email or creating a separate file can be cumbersome. Now, exclusively in PowerPoint for the web, you can share a deck with a link to a specific slide. Just right-click the slide thumbnail, select Link to this slide, and copy the link.

Image of an individual slide about to be copied in PowerPoint.

We also released Presenter Coach in PowerPoint for the web in public preview. Presenter Coach uses the power of artificial intelligence (AI) to give users real-time, on-screen feedback to improve public speaking skills. Rehearse your presentation and get helpful tips on pacing, using inclusive language, avoiding filler words like “basically” or “um,” and steering clear of a rote presentation style.

Stay on top of your most recent notes and adjust your color schemes in OneNote—Updates to OneNote allow you to take your notes with you on the go and personalize your experience. Recent Notes is now also available on Mac, helping you easily pick up where you left off on any device with a chronological list of pages you recently viewed or edited.

On iPad, we integrated OneNote with Sticky Notes, so you can surface your notes there and avoid toggling between apps. Finally, you can now switch to a darker canvas with the new dark mode in OneNote for Mac. All these new features are available now.

Animated image of a recipe being shared in OneNote.

Overcome language barriers in your surveys—The Microsoft Forms Pro app now offers the option to create surveys in multiple languages without having to merge separate documents. Multilingual support enables Pro users to reach a broader audience and helps respondents provide richer feedback by displaying questions in their preferred language. Beginning next month, you’ll be able to access multilingual support by clicking on the ellipsis in the top right corner and selecting Multilingual from the drop-down menu.

Animated image of a multilingual survey in Microsoft Forms.

Export Visio workflows to Microsoft Flow to quickly automate business processes—Designing processes quickly and automating workflows can help accelerate productivity, but it’s often easier said than done. Now you can easily create new automation flows in the familiar diagramming experience of Visio and seamlessly export them as a fully functioning workflow to Microsoft Flow. Built-in Business Process Model and Notation (BPMN) stencils have sharing and commenting capabilities, simplifying development and collaboration. Once the workflow’s complete, you can publish it to Microsoft Flow with a single click. This capability is now generally available to all Visio Plan 2 users through the Visio desktop app.

Streamline IT management and security

New features simplify the management of corporate Android devices, making it easy to add Microsoft 365 customers and ensure endpoints and devices are secure.

Manage corporate-owned Android devices with Microsoft Intune—With support for Android Enterprise fully managed devices, Microsoft 365 customers can deliver a high-quality and feature-rich productivity scenario for employees on corporate-owned devices, while maintaining an extended set of policy controls over those devices. Intune support for Android Enterprise work phone management is now available.

To get started, check out our Intune documentation.

Image of three phones side by side setting up a device as a work phone in Microsoft Intune.

Save time when adding new employees to Microsoft 365—Adding and configuring new employees or freelancers to Microsoft 365 just got easier. Beginning this month, you can create and use a template to save time and standardize settings when adding people in the Microsoft 365 admin center. Templates are particularly useful if you have employees who share many attributes, like those who work in the same role and the same location.

Beginning this month, head to the Microsoft 365 admin center and select Users > Active Users > User Templates > Add Template.

Animated image of a user being added to an engineers list in the Microsoft 365 admin center.

Automate incident response with Office 365 ATP—We’re excited to announce the general availability of Automated Incident Response in Office 365 ATP, which applies powerful automation capabilities to investigation and response workflows, dramatically improving the effectiveness and efficiency of your business’s security teams. These capabilities are available to organizations with the Office 365 ATP Plan 2, Office 365 E5, and Microsoft 365 E5 Security SKUs.

Evaluate security products with ease—The new evaluation lab in Microsoft Defender Advanced Threat Protection removes the challenges of machine installation and configuration. Security experts can verify a potential platform, familiarize themselves with the product, learn about new features, or use the lab environment for attack simulations.

The new evaluation lab is now generally available. To access it, select Evaluation and tutorials > Evaluation lab directly from the navigation menu.

Image of the evaluation lab dashboard in Microsoft Defender Advanced Threat Protection.

Also new this month

  • The redesigned Outlook on the web is now generally available. The redesigned experience features a modern design, new and smarter features, and a faster framework.
  • Now generally available, Security Policy Advisor—introduced in public preview this spring—can examine your policies and provide recommendations to improve security.
  • Windows Virtual Desktop is now generally available, delivering a virtual, multi-session Windows 10 experience. Windows Virtual Desktop enables you to deploy and scale your Windows desktops and apps on Azure in minutes.

As always, every Microsoft 365 update reflects our commitment to improving the experience for you—so if you have feedback or ideas on how we can improve, don’t hesitate to let us know.

Go to Original Article
Author: Microsoft News Center

CIOs express hope, concern for proposed interoperability rule

While CIOs applaud the efforts by federal agencies to make healthcare systems more interoperable, they also have significant concerns about patient data security.

The Office of the National Coordinator for Health IT (ONC) and the Centers for Medicare & Medicaid Services proposed rules earlier this year that would further define information blocking, or unreasonably stopping a patient’s information from being shared, and outline requirements for healthcare organizations to share data, such as using FHIR-based APIs so patients can download their healthcare data onto mobile healthcare apps.
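FHIR-based APIs expose patient records as plain JSON resources, which is what makes consumer-facing apps practical. As a rough illustration (the resource below is a made-up minimal FHIR R4 Patient, not data from any real system), an app might extract a display name like this:

```python
import json

# Illustrative only: a minimal FHIR R4 Patient resource of the kind a
# patient-facing app might receive from a FHIR-based API. All values
# here are invented for the example.
patient_json = """
{
  "resourceType": "Patient",
  "id": "example",
  "name": [{"family": "Chalmers", "given": ["Peter", "James"]}],
  "birthDate": "1974-12-25"
}
"""

def display_name(patient: dict) -> str:
    """Build a display name from the first 'name' entry of a Patient resource."""
    name = patient["name"][0]
    return " ".join(name.get("given", []) + [name.get("family", "")]).strip()

patient = json.loads(patient_json)
print(display_name(patient))  # -> Peter James Chalmers
```

The point of the standard is that the same parsing logic works against any conformant server, which is why the proposed rules lean on it.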

The proposed rules are part of an ongoing interoperability effort mandated by the 21st Century Cures Act, a healthcare bill that provides funding to modernize the U.S. healthcare system. Final versions of the proposed information blocking and interoperability rules are on track to be released in November.

“We all now have to realize we’ve got to play in the sandbox fairly and maybe we can cut some of this medical cost through interoperability,” said Martha Sullivan, CIO at Harrison Memorial Hospital in Cynthiana, Ky.

CIOs’ take on proposed interoperability rule

To Sullivan, interoperability brings the focus back to the patient — a focus she thinks has been lost over the years.

She commended ONC’s efforts to make patient access to health information easier, yet she has concerns about data stored in mobile healthcare apps. Harrison’s system is API-capable, but Sullivan said the organization will not recommend APIs to patients for liability reasons.

Physicians and CIOs at EHR vendor Meditech’s 2019 Physician and CIO Forum in Foxborough, Mass. Helen Waters, Meditech executive vice president, spoke at the event.

“The security concerns me because patient data is really important, and the privacy of that data is critical,” she said.

Harrison may not be the only organization reluctant to promote APIs to patients. A study published in the Journal of the American Medical Association of 12 U.S. health systems that used APIs for at least nine months found “little effort by healthcare systems or health information technology vendors to market this new capability to patients” and went on to say “there are not clear incentives for patients to adopt it.”

Jim Green, CIO at Boone County Hospital in Iowa, said ONC’s efforts with the interoperability rule are well-intentioned but overlook a significant pain point: physician adoption. He said more efforts should be made to create “a product that’s usable for the pace of life that a physician has.”

The product also needs to keep pace with technology, something Green described as being a “constant battle.”

There are some nuances there that make me really nervous as a CIO.
Jeannette Currie, CIO of Community Hospitals, Beth Israel Deaconess Medical Center

Interoperability is often temporary, he said. When a system gets upgraded or a new version of software is released, it can throw the system’s ability to share data with another system out of whack.

“To say at a point in time, ‘We’re interoperable with such-and-such a product,’ it’s a point in time,” he said.

Interoperability remains “critically important” for healthcare, said Jeannette Currie, CIO of Community Hospitals at Beth Israel Deaconess Medical Center in Boston. But so is patient data security. That’s one of her main concerns with ONC’s efforts and the interoperability rule, something physicians and industry experts also expressed during the comment period for the proposed rules.

“When I look at the fact that a patient can come in and say, ‘I need you to interact with my app,’ and when I look at the HIPAA requirements I’m still beholden to, there are some nuances there that make me really nervous as a CIO,” she said.

Go to Original Article

Swim DataFabric platform helps to understand edge streaming data

The new Swim DataFabric platform aims to help IT professionals categorize and make sense of large volumes of streaming data in real time.

The startup, based in San Jose, Calif., emerged from stealth in April 2018, with the promise of providing advanced machine learning and artificial intelligence capabilities to meet data processing and categorization challenges.

With the new Swim DataFabric, released Sept. 18, the vendor is looking to make it easier for more users to analyze data. The platform integrates with Microsoft Azure cloud services, including the Azure IoT suite and Azure Data Lake Storage, to classify and analyze data and help make predictions in real time.

The Swim DataFabric platform helps users get the most out of their real-time data with any distributed application including IoT and edge use cases, said Krishnan Subramanian, Rishidot Research chief research advisor.

“Gone are those days where REST is a reasonable interface for real-time data because of latency and scalability issues,” Subramanian said. “This is where Swim’s WARP protocol makes more sense and I think it is going to change how the distributed applications are developed as well as the user experience for these applications.”

Why the Swim DataFabric is needed

I think it is going to change how the distributed applications are developed as well as the user experience for these applications.
Krishnan Subramanian, chief research advisor, Rishidot Research

A big IT challenge today is that users are getting streams of data from assets that are essentially boundless, said Simon Crosby, CTO at Swim. “A huge focus in the product is on really making it extraordinarily simple for customers to plug in their data streams and to build the model for them, taking all the pain out of understanding what’s in their data,” Crosby said.

Swim’s technology is being used by cities across the U.S. to help with road traffic management. The vendor has a partnership with Trafficware for a program that receives data from traffic sensors as part of a system that helps predict traffic flows.

The Swim DataFabric platform moves the vendor into a different space. The Swim DataFabric is focused on enabling customers that are Microsoft Azure cloud adopters to benefit from the Swim platform.

“It has an ability to translate any old data format from the edge into the CDM (Common Data Model) format which Microsoft uses for the ADLS (Azure Data Lake Storage) Gen2,” Crosby said. “So, a Microsoft user can now just click on the Swim DataFabric, which will figure out what is in the data, then labels the data and deposits it into ADLS.”

Screenshot of Swim architecture
Swim architecture

With the labelled data in the data lake, Crosby explained that the user can then use whatever additional data analysis tool they want, such as Microsoft’s Power BI or Azure Databricks.

He noted that Swim also has a customer that has chosen to use Swim technology on Amazon Web Services, but he emphasized that the Swim DataFabric platform is mainly optimized for Azure, due to that platform’s strong tooling and lifecycle management capabilities.

Swim DataFabric digital twin

One of the key capabilities that the Swim DataFabric provides is what is known as a digital twin model. The basic idea is that a data model is created that is a twin or a duplicate of something that exists in the real world.

“What we want is independent, concurrent, parallel processing of things, each of which is a digital twin of a real-world data source,” Crosby explained.

The advantage of the digital twin approach is fast processing as well as the ability to correlate and understand the state of data. With the large volumes of data that can come from IoT and edge devices, Crosby emphasized that understanding the state of a device is increasingly valuable.
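In code, the pattern Crosby describes can be sketched as one small stateful object per real-world source, with each raw event folded into the matching twin's state so the latest state is always queryable. This is only an illustration of the general digital-twin idea, not Swim's API; the `SensorTwin` class and the event data are invented for the example:

```python
from dataclasses import dataclass, field
from typing import Dict, List, Optional

@dataclass
class SensorTwin:
    """A hypothetical in-memory 'twin' mirroring one real-world sensor."""
    sensor_id: str
    reading: Optional[float] = None          # latest observed value
    history: List[float] = field(default_factory=list)

    def on_event(self, value: float) -> None:
        """Fold a new raw reading into the twin's state."""
        self.reading = value
        self.history.append(value)

    def average(self) -> float:
        return sum(self.history) / len(self.history)

# Route each incoming event to the twin that mirrors its source,
# creating the twin on first sight of a new source.
twins: Dict[str, SensorTwin] = {}
events = [("loop-1", 4.0), ("loop-2", 7.0), ("loop-1", 6.0)]
for sensor_id, value in events:
    twins.setdefault(sensor_id, SensorTwin(sensor_id)).on_event(value)

print(twins["loop-1"].reading)    # -> 6.0 (latest state, no query needed)
print(twins["loop-1"].average())  # -> 5.0 (derived from accumulated state)
```

Because each twin holds its own state independently, the twins can in principle be processed concurrently and in parallel, which is the property the article highlights.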

“Everything in Swim is about transforming data into streamed insights,” Crosby said.

Go to Original Article

Cohesity Agile Dev and Test quickly delivers clean test data

Cohesity wants to make it easier for developers to get copies of data quickly.

Cohesity Agile Dev and Test, released into beta this week, is an add-on to Cohesity DataPlatform. The feature makes clones of data sets stored by DataPlatform, allowing test/dev teams to access data without going through other teams.

Cohesity Agile Dev and Test allows DevOps teams to provision backup data without having to go through a typical request-fulfill model.

Usually, when developers need a copy of the business’s data for testing or development, they have to request it from the production or backup teams. This data needs to be accurate and up to date for ideal test results, but it also has to be scrubbed of personally identifiable information (PII) and otherwise masked to prevent exposing the test/dev teams to compliance issues. The process can take weeks, which is too long for time-sensitive development such as Agile projects and anything to do with machine learning.

Cohesity claims its software performs data masking before the data is pulled, ensuring test/dev teams have a “clean” copy to work with.
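Field-level masking of this kind is straightforward to picture. The sketch below is illustrative only (not Cohesity's implementation; the field list and hashing scheme are assumptions), but it shows the general shape: sensitive fields are replaced with stable, non-reversible tokens while everything else passes through untouched:

```python
import hashlib

# Hypothetical set of fields treated as PII; a real product would
# drive this from policy, discovery, or schema metadata.
PII_FIELDS = {"name", "email", "ssn"}

def mask_value(value: str) -> str:
    """Replace a sensitive value with a stable, non-reversible token.

    Hashing (rather than random replacement) keeps the token consistent
    across records, so joins on masked columns still line up in test data.
    """
    return "masked-" + hashlib.sha256(value.encode()).hexdigest()[:8]

def mask_record(record: dict) -> dict:
    """Return a copy of the record with PII fields masked."""
    return {k: (mask_value(str(v)) if k in PII_FIELDS else v)
            for k, v in record.items()}

record = {"id": 42, "name": "Ada Lovelace", "email": "ada@example.com"}
clean = mask_record(record)
print(clean["id"])        # -> 42 (non-PII fields pass through unchanged)
print(clean["name"][:7])  # -> masked-
```

Running a pass like this before data reaches test/dev teams is what lets them work with realistic, referentially consistent data without ever seeing the original PII.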

Similar products that use backup copies of data for test/dev purposes already exist, such as Actifio Sky and Cohesity’s own CloudSpin. The difference with Cohesity Agile Dev and Test is the data doesn’t need to be stood up in its own environment — it doesn’t create a separate silo of data. Sky and CloudSpin spin up the data into another environment, such as a physical server or virtual machine.

screenshot of Cohesity Agile Dev and Test
Cohesity Agile Dev and Test creates data clones rather than data copies; the clones don’t need to be stood up in another environment.

Old idea, new implementation

Christophe Bertrand, senior analyst at IT research firm Enterprise Strategy Group, said Cohesity Agile Dev and Test doesn’t solve a technology problem so much as a workflow one. Making copies is nothing new, but a streamlined way to get those copies into the hands of test/dev people is.

“We’ve known for a very long time how to replicate data, but the workflow behind it is not really in place,” said Bertrand. “This is just the next step. It’s exactly what they [Cohesity] should be doing.”

Bertrand said his research shows that many enterprises want to make use of copies of data, to transform from simply having backup copies toward a model of intelligent data management. Businesses want interconnectivity across the organization to ensure any IT operation can access and make use of data. Bertrand said the market is headed in that direction as organizations that can develop faster are inherently at an advantage.

The test/dev people want data that is fresh, compliant, secure and not corrupted. They want to be able to access this data quickly and independently, in order to rapidly turn around their projects. It’s a different audience from the backup admin group, said Bertrand, as they’re not worried about things like availability and RTO/RPO.

The DevOps person doesn’t know or care about the intricacies of DR or backup.
Christophe Bertrand, senior analyst, Enterprise Strategy Group

“The DevOps person doesn’t know or care about the intricacies of DR or backup,” Bertrand said.

Bertrand said Cohesity Agile Dev and Test’s main value proposition is that it gives test/dev teams the ability to get great data instantly. They don’t need to know everything about that data.

Cohesity Agile Dev and Test is scheduled for release in the Pegasus 6.4.1 update, expected in late 2019. It will be sold as an add-on capability on Cohesity DataPlatform, and customers will be charged on a usage-based pricing model.

Go to Original Article

New integration brings Fuze meetings to Slack app

An integration unveiled this week will make it easier for Slack users to launch and join Fuze meetings. Zoom Video Communications Inc. rolled out a similar integration with Slack over the summer.

Slack is increasingly making it clear that it intends to incorporate voice and video capabilities into its team messaging app through integrations and partnerships, rather than by attempting to build the technology on its own.

Fuze Inc.’s announcement also underscores how big a player Slack has become in the business collaboration industry. Fuze, a cloud unified communications (UC) provider, opted to partner with Slack, even though it sells a team messaging app with the same core capabilities.

The integration lets users launch Fuze meetings by clicking on the phone icon in Slack, instead of typing a command. They will also see details about an ongoing meeting, such as how long it’s been going on and who’s participating.

Furthermore, Slack’s Microsoft Outlook and Google Calendar apps will let users join scheduled Fuze meetings with one click. Slack previously announced support for that capability with Zoom, Cisco Webex and Skype for Business.
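Neither company has published the specifics of the integration, but third-party calling apps typically surface meetings in Slack through Slack's Calls API, registering a call with the `calls.add` Web API method. A minimal sketch of the request body such an app might build (the method and field names are Slack's; the call ID, URL, and title here are made up):

```python
import json

def build_calls_add_payload(external_id, join_url, title=None):
    """Build the request body for Slack's calls.add Web API method,
    which registers a third-party call so it appears in Slack."""
    payload = {
        "external_unique_id": external_id,  # the provider's own call ID
        "join_url": join_url,               # where Slack sends users who click Join
    }
    if title:
        payload["title"] = title            # optional display name for the call
    return payload

payload = build_calls_add_payload(
    "mtg-42", "https://example.com/join/mtg-42", title="Weekly sync")
print(json.dumps(payload))
# In a real integration this JSON would be POSTed to
# https://slack.com/api/calls.add with the app's bearer token.
```

Registering the call this way is what lets Slack show in-channel details such as who is participating and how long the call has been running.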

No formal partnership

Slack gave Fuze special access to the set of APIs that made the latest integrations possible, said Eric Hanson, Fuze’s vice president of marketing intelligence. But the companies later clarified there was no formal partnership between them.

The vendors apparently miscommunicated about how to frame this week’s announcement. Within hours on Tuesday, Fuze updated a blog post to remove references to a “partnership” with Slack, instead labeling it as an “integration.”

In contrast, Slack and Zoom signed a contract to align product roadmaps and marketing strategies earlier this year.

In the future, Fuze hopes to give users the ability to initiate phone calls through Slack. Previously, Slack said it would enable such a feature with Zoom Phone, the video conferencing provider’s new cloud calling service.

Slack declined to comment on any plans to expand the Fuze integration.

“There are still some things that Slack hasn’t made available through this set of APIs yet,” Hanson said. “They have a roadmap in terms of where they want to take this.”

Making it easier for users to pick and choose

The voice and video capabilities natively supported in Slack are far less advanced than those available from main rival Microsoft Teams, an all-in-one suite for calling, messaging and meetings. But users want to be able to easily switch between messaging with someone and talking to them in real time.

By integrating with cloud communications vendors like Fuze and Zoom, Slack can focus on what it does best — team-based collaboration — while still connecting to the real-time communications services that customers need, said Mike Fasciani, analyst at Gartner.

“One of Slack’s advantages over Microsoft Teams is its ability and willingness to integrate with many business and communications applications,” Fasciani said.

Fuze also competes with Microsoft Teams. Integrations like this one should help cloud UC providers sell to Slack's rapidly expanding customer base. Slack now has more than 100,000 paid customers, including 720 enterprises that each contribute more than $100,000 per year in revenue.

“Even though Fuze has its own [messaging] app, it doesn’t have anywhere near the market share of Slack,” said Irwin Lazar, analyst at Nemertes Research. “I think this shows Slack’s continued view that they don’t want to compete directly with the voice/meeting vendors.”

Go to Original Article

Data ethics issues create minefields for analytics teams

GRANTS PASS, Ore. — AI technologies and other advanced analytics tools make it easier for data analysts to uncover potentially valuable information on customers, patients and other people. But, too often, consultant Donald Farmer said, organizations don’t ask themselves a basic ethical question before launching an analytics project: Should we?

In the age of GDPR and like-minded privacy laws, though, ignoring data ethics isn’t a good business practice for companies, Farmer warned in a roundtable discussion he led at the 2019 Pacific Northwest BI & Analytics Summit. IT and analytics teams need to be guided by a framework of ethics rules and motivated by management to put those rules into practice, he said.

Otherwise, a company runs the risk of crossing the line in mining and using personal data — and, typically, not as the result of a nefarious plan to do so, according to Farmer, principal of analytics consultancy TreeHive Strategy in Woodinville, Wash. “It’s not that most people are devious — they’re just led blindly into things,” he said, adding that analytics applications often have “unforeseen consequences.”

For example, he noted that smart TVs connected to home networks can monitor whether people watch the ads in shows they’ve recorded and then go to an advertiser’s website. But acting on that information for marketing purposes might strike some prospective customers as creepy, he said.

Shawn Rogers, senior director of analytic strategy and communications-related functions at vendor Tibco Software Inc., pointed to a trial program that retailer Nordstrom launched in 2012 to track the movements of shoppers in its stores via the Wi-Fi signals from their cell phones. Customers complained about the practice after Nordstrom disclosed what it was doing, and the company stopped the tracking in 2013.

“I think transparency, permission and context are important in this area,” Rogers said during the session on data ethics at the summit, an annual event that brings together a small group of consultants and vendor executives to discuss BI, analytics and data management trends.

AI algorithms add new ethical questions

Being transparent about the use of analytics data is further complicated now by the growing adoption of AI tools and machine learning algorithms, Farmer and other participants said. Increasingly, companies are augmenting — or replacing — human involvement in the analytics process with “algorithmic engagement,” as Farmer put it. But automated algorithms are often a black box to users.

Mike Ferguson, managing director of U.K.-based consulting firm Intelligent Business Strategies Ltd., said the legal department at a financial services company he works with killed a project aimed at automating the loan approval process because the data scientists who developed the deep learning models to do the analytics couldn’t fully explain how the models worked.


And that isn’t an isolated incident in Ferguson’s experience. “There’s a loggerheads battle going on now in organizations between the legal and data science teams,” he said, adding that the specter of hefty fines for GDPR violations is spurring corporate lawyers to vet analytics applications more closely. As a result, data scientists are focusing more on explainable AI to try to justify the use of algorithms, he said.

The increased vetting is driven more by legal concerns than data ethics issues per se, Ferguson said in an interview after the session. But he thinks that the two are intertwined and that the ability of analytics teams to get unfettered access to data sets is increasingly in question for both legal and ethical reasons.

“It’s pretty clear that legal is throwing their weight around on data governance,” he said. “We’ve gone from a bottom-up approach of everybody grabbing data and doing something with it to more of a top-down approach.”

Jill Dyché, an independent consultant who’s based in Los Angeles, said she expects explainable AI to become “less of an option and more of a mandate” in organizations over the next 12 months.
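One widely used explainable-AI technique that analytics teams reach for in situations like the loan-approval case is permutation importance: shuffle one feature's values, measure how much the model's accuracy drops, and report that drop as the feature's importance. A self-contained sketch with a toy approval rule standing in for a real model (the data and the rule are invented for illustration):

```python
import random

def accuracy(model, X, y):
    """Fraction of rows the model classifies correctly."""
    return sum(model(row) == label for row, label in zip(X, y)) / len(y)

def permutation_importance(model, X, y, seed=0):
    """Importance of each feature = drop in accuracy when that feature's
    column is shuffled, breaking its link to the labels."""
    rng = random.Random(seed)
    baseline = accuracy(model, X, y)
    importances = []
    for j in range(len(X[0])):
        column = [row[j] for row in X]
        rng.shuffle(column)
        X_shuffled = [row[:j] + [v] + row[j + 1:] for row, v in zip(X, column)]
        importances.append(baseline - accuracy(model, X_shuffled, y))
    return importances

# Toy "model": approve when feature 0 (say, an income score) exceeds 0.5.
# Feature 1 is pure noise the model never looks at.
model = lambda row: 1 if row[0] > 0.5 else 0

rng = random.Random(1)
X = [[rng.random(), rng.random()] for _ in range(500)]
y = [model(row) for row in X]

imp = permutation_importance(model, X, y)
print(imp)  # large drop for feature 0, zero drop for the noise feature
```

The appeal for legal and governance teams is that the output is a per-feature number that can be explained without opening the model's black box, which is exactly the kind of justification the data scientists in Ferguson's example lacked.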

Code of ethics not enough on data analytics

Staying on the right side of the data ethics line takes more than publishing a corporate code of ethics for employees to follow, Farmer said. He cited Enron’s 64-page ethics code, which didn’t stop the energy company from engaging in the infamous accounting fraud scheme that led to bankruptcy and the sale of its assets. Similarly, he sees such codes having little effect in preventing ethical missteps on analytics.

“Just having a code of ethics does absolutely nothing,” Farmer said. “It might even get in the way of good ethical practices, because people just point to it [and say], ‘We’ve got that covered.'”

Instead, he recommended that IT and analytics managers take a rules-based approach to data ethics that can be applied to all three phases of analytics projects: the upfront research process, design and development of analytics applications, and deployment and use of the applications.

Go to Original Article

New Approaches to Home and Xbox Voice Commands Roll Out to Xbox Insiders – Xbox Wire

As Xbox Insiders, your feedback helps inform the decisions and updates we make on Xbox, from new features to how gamers interact with the console itself. Based on your valuable feedback, we’ve been continuing to iterate on two key experiences on Xbox, delivering you a faster Home experience and evolving the way we support Xbox voice commands to improve the voice experience.

Evolving Home

Home is the first thing you see when you turn on your Xbox One, and we want navigating your console to be easy and seamless. We've heard your feedback and have continued to iterate on Home to get you into your gaming experiences faster and keep more of your content front and center. With today's update, we're experimenting with a streamlined user interface.

With this new experimental Home design, the first thing you’ll notice is we’ve removed the Twists from the top of Home in favor of separate buttons that launch your gaming experiences. The goal is to let you jump into Xbox Game Pass, Mixer, Xbox Community and Microsoft Store quicker than ever. We’ve also shifted things around to make more room for your recently played titles.

We need your help testing the new interface. The new experimental Home rolls out this week to select Xbox Insiders in our Alpha and Alpha Skip Ahead rings. For more details on the rollout, keep an eye on the Xbox Insiders section of Xbox Wire. The Home experience will continue to evolve based on your feedback, so please let us know what you think and share your ideas for Home at the Xbox Ideas Hub. You may see this layout change, and even come and go, as we iterate on your feedback.

Changes to voice commands on Xbox One

Last fall, we expanded Xbox voice commands to hundreds of millions of smart devices by enabling Xbox One to connect with the Xbox Skill for Cortana and Alexa-enabled devices. The Xbox Skill continues to grow and change based on your feedback, including new updates that rolled out earlier this month.

Building on these efforts, we are now further evolving the way we support voice commands on Xbox, moving away from on-console experiences to cloud-based assistant experiences. This means you can no longer talk to Cortana via your headset. However, you can use the Xbox Skill for Cortana via the Cortana app on iOS, Android, and Windows, or via the Harman Kardon Invoke speaker, to power your Xbox One, adjust volume, launch games and apps, capture screenshots, and more, just as you can do with Alexa-enabled devices today. We'll also continue to improve the Xbox Skill across supported digital assistants and keep expanding our Xbox voice capabilities in the future based on fan feedback.

Starting this week, this update will roll out to our Alpha Skip Ahead ring and will fully roll out to all users this fall. As part of these changes, this update will temporarily disable dictation for the virtual keyboard on Xbox One. Don't worry, though; our team is working on an alternative solution and will have more details to share soon.

As always, your feedback is important to us and our partners as we continue to evolve the Xbox One Home and shape the digital assistant and voice command experience on Xbox. We have some exciting updates in the works and can’t wait to share what’s next, so stay tuned for more.

Go to Original Article
Author: Steve Clarke