ARCHIVE FOR THE ‘data-analysis’ CATEGORY
Nov 06, 2019 • Features • future of field service • Martin Summerhayes • data analysis • Data Management
As a sector we are drowning in data. Filtering it all down to something useful is growing more and more difficult. Here, Martin Summerhayes, outlines a strategy that will keep your head above the ones and zeros.
Jun 20, 2019 • Features • future of field service • Shaun West • data analysis
How To Develop Data-Based Solutions
Today any machine can be digitized and connected; collecting the data is not the issue. What is becoming more important is how the field data can be exploited to identify the right action to take.
This creates a very complex problem, as the raw data must be transformed so that only the right information, at the right time, in the right form, is delivered to the right decision-maker, independently of the problem domain - e.g. root cause analysis, demand forecasting, productivity optimisation, spare parts delivery. Helping people take decisions can be seen as a smart service, designed on the basis of a thorough understanding of the business complexity.
That complexity lives in ecosystems made up of people and equipment, business objectives and strategies, as well as the personal needs, attitudes and preferences (must-haves, nice-to-haves) of each operator. Once these needs are fully understood, information can be elaborated from data to create the right insights. The Data2Action framework provides guidance for the development of data-driven services. The understanding of why and how customers interact with assets is achieved using a Design Thinking approach.
Principles Of Service Design Thinking Underpin The Data2Action Framework
A good design is not only a matter of aesthetics. When designing a product, many factors have to be considered: for example, how the product is going to be used, and by whom. This determines the product's functions, form, materials, colours and so on, and it requires the ability to understand what the product's user is trying to achieve (an outcome, an experience).
The same applies to service design, in which the object to be designed is a process that aims to reach a goal through the use of products, software applications, information and so on. The challenge is that many more people are involved in the consumption and delivery of a service, the service relies on collaboration, and the service is mostly intangible. Service Design takes a hands-on, user-centred approach to problem definition and idea/solution generation that can lead to innovation.
This is of utmost importance, as the application of these principles can lead to competitive advantage. Remember, you only do things that are of value to you in one form or another. Service Design Thinking (SDT) is an approach that aims to design services by applying different tools based on five principles.
Service design thinking should be:
• User centred;
• Co-creative;
• Sequencing;
• Evidencing;
• Holistic.
Understanding The Problem: Why Understand First?
How can a problem or challenge be successfully solved without understanding it properly? It cannot. Without a deep understanding, disruptive solutions will not work, or you will simply be applying sticking plasters. The challenge ahead of you is to understand, describe and visualise the situation. Understanding a complex problem requires knowing who the people and equipment involved are, and how the processes in which they operate run. The understanding phase of the Data2Action framework consists of mapping the overall job-to-be-done of the customer, mapping the actors, and using avatars to build the ecosystem to discover and appraise who and what is involved.
Principles for Digital Service Development: How To Generate The Best Ideas
The problem statements and the ecosystem visualisation developed so far provide a solid foundation for the development of new ideas and solutions for services. Some new ideas may have already appeared and can be improved in this phase.
These cases, also called scenarios or user stories, can be visualized using the customer journey blueprint. In the customer journey blueprint, the processes, actions, and involved personas/avatars are visualized to display the desired situation, in which the problems are solved.
Outcomes for each actor should be clearly defined here, along with any payoffs. Working with pen and paper works really well. For smart services with many actors and many machines, expect there to be many scenarios to focus on and even more ideas for improvements. In the ideation stage, many ideas will be generated.
The ideas need to be rated in order to evaluate which are worth prototyping. For selecting the best ideas, an idea scoring system works best.
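A simple weighted scoring scheme is one way to rank ideas. The criteria and weights below are illustrative assumptions (the framework does not prescribe specific ones); in practice the workshop team agrees on its own.

```python
# Minimal sketch of an idea scoring system. Criteria, weights and the
# example ideas are hypothetical, not taken from the Data2Action framework.
WEIGHTS = {"customer_value": 0.5, "feasibility": 0.3, "effort": 0.2}

def score(ratings):
    """Weighted score for one idea; ratings maps criterion -> 1-5 rating."""
    return sum(WEIGHTS[c] * r for c, r in ratings.items())

ideas = {
    "predictive spare-parts alert": {"customer_value": 5, "feasibility": 3, "effort": 4},
    "live route dashboard": {"customer_value": 4, "feasibility": 4, "effort": 3},
}

# Rank ideas by score; the top entries go forward to prototyping.
ranked = sorted(ideas, key=lambda name: score(ideas[name]), reverse=True)
```

The point is not the arithmetic but forcing an explicit, comparable judgement per idea before any prototyping effort is committed.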
Building Valuable Solutions: Creating Information From The Raw Data
With the overview of the ecosystem of people, processes and machines, it becomes clear from the scenarios and user stories where the data is produced and who needs to consume the information derived from it. Prototyping is a way to validate ideas and possible solutions; it should be fast and keep concepts as simple as possible. This avoids spending too many resources on building solutions only to find out that they do not work.
The best way is to create hand-drawn dashboards or widgets which represent the solution and test them as quickly as possible before starting the actual implementation (often coding!). The process of drawing dashboards may also reveal new ideas, or new insights into whether the solution is technically feasible or not. Many dashboards will be created; to keep them organised we use the Case Actor Matrix (CAM).
This tool matches actors with cases (the scenarios used earlier) and the dashboards, enabling an understanding of their purpose - how would you use this dashboard to help make a decision? A logical cascade should be built, and dashboard widgets should be reused as much as makes sense. These conceptual solutions then need to be challenged from a technical perspective.
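One lightweight way to hold a Case Actor Matrix is as a mapping from (case, actor) pairs to the dashboard that serves them. The cases, actors and widget names below are hypothetical examples, not from the article.

```python
# Sketch of a Case Actor Matrix (CAM): (case, actor) -> dashboard/widget.
# All entries here are illustrative placeholders.
cam = {
    ("unplanned downtime", "service technician"): "fault-history widget",
    ("unplanned downtime", "operations manager"): "fleet uptime dashboard",
    ("spare parts delivery", "planner"): "stock-level forecast widget",
}

def dashboards_for(actor):
    """All dashboards one actor needs across cases - a cue for widget reuse."""
    return sorted({dash for (case, a), dash in cam.items() if a == actor})
```

Querying the matrix per actor makes duplicated widgets visible, which supports the "reuse as much as makes sense" cascade described above.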
We use a Source Target Link Matrix (STL Matrix) to show the information needed from the conceptual point of view, and to define the requirements and quality of the data needed to develop the dashboards. The matrix distinguishes between existing data and data that needs to be collected, as well as adjustments and improvements that have to be made to the databases.
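The STL Matrix can likewise be sketched as rows linking a data source to a target dashboard, tagged with the state of the data. Source names and statuses here are assumptions for illustration only.

```python
# Sketch of a Source Target Link (STL) matrix:
# (data source, target dashboard, data status). Entries are hypothetical.
stl = [
    ("machine vibration sensor", "fault-history widget", "existing"),
    ("ERP stock table", "stock-level forecast widget", "needs cleaning"),
    ("operator shift log", "fleet uptime dashboard", "to be collected"),
]

# Separate data that already exists from data that still has to be
# collected or improved before the dashboards can be built.
existing = [row for row in stl if row[2] == "existing"]
gaps = [row for row in stl if row[2] != "existing"]
```

The "gaps" list becomes the concrete data-engineering backlog implied by the conceptual dashboards.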
Test Ideas And Improve
Testing is essential within the Data2Action framework and Service Design. It should happen as quickly as possible, to avoid developing solutions which do not fulfil the identified case and/or are not technically feasible. The best method for testing usability is to hand the dashboard over to the target actor, ask them to try to use it, and listen to their feedback; based on that feedback, the usefulness can be improved. New ideas also come out of the feedback discussions. The technical aspect needs to be evaluated as well - meaning that the information derived from the data is actually significant. This is determined by the data experts together with the user.
Dec 13, 2018 • Features • aviation • Data • Future of Field Service • future of field service • Blockchain • Cyber Security • field service • IFS • Service Management • Stephen Jeff Watts • data analysis • Managing the Mobile Workforce
Blockchain and its potential has been mooted in field service circles for years. Is it time we stop thinking big and instead build smaller use-cases before we lose sight of what’s actually important, the end-user? Mark Glover, Field Service News’ Deputy Editor finds out more.
In 2008, a person (or a group of people) known as Satoshi Nakamoto conceptualised the first blockchain. A year later, this digital ledger was the critical accessory to the group’s (or his) headline act, the now ubiquitous cryptocurrency Bitcoin.
The impact of this decentralised digital currency on financial markets and a curious, confused society has been fascinating to follow. That the persona of the inventor or the inventors remains unknown adds to the plot.
Yet, without blockchain, the currency wouldn’t function. This smart ledger, driven by a peer-to-peer network has the potential to stamp itself on industry and in particular field service. But can the sector adopt the technology in a way that will ultimately benefit the end-user?
Firstly though, and apologies to all those who have a handle on the technology, what is blockchain? Scouring the internet for a simple definition is tricky; eventually, the excellent, forward-thinking mission.com offered this: “Blockchain is the technology that underpins digital currency (Bitcoin, Litecoin, Ethereum and the like). The tech allows digital information to be distributed, but not copied. That means that each individual piece of data can only have one owner.”
"The tech allows digital information to be distributed, but not copied. That means that each individual piece of data can only have one owner..."
Straightforward enough. But let’s expand it to industry. How can it fit into the aerospace sector, and specifically a plane engine? Parties involved include the airline, the engine manufacturer and the service company, all of whom are squirting data into that asset’s blockchain.
The jet engine is a high-end, valuable piece of equipment, and a blockchain enables a single, irrefutable history of that asset. The linking of parties (blocks) removes the requirement for inter-party consultation before extracting required information, meaning critical decisions can be made more quickly and effectively. It is also secure, accurate and visible to everyone, so trust is enhanced around the chain. The benefits are tangible. So why aren’t all companies rushing to implement it?
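The "single, irrefutable history" idea rests on each record being cryptographically linked to the one before it. A minimal sketch of that linking, with hypothetical service records (real blockchains add consensus, signatures and distribution across the parties):

```python
# Toy hash-linked ledger for an asset's service history - illustrative only.
import hashlib
import json

def add_block(chain, record):
    """Append a record linked to the hash of the previous block."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    payload = json.dumps({"record": record, "prev": prev_hash}, sort_keys=True)
    chain.append({
        "record": record,
        "prev": prev_hash,
        "hash": hashlib.sha256(payload.encode()).hexdigest(),
    })
    return chain

def is_valid(chain):
    """Any tampering with an earlier record breaks every later link."""
    for i, block in enumerate(chain):
        prev = chain[i - 1]["hash"] if i else "0" * 64
        payload = json.dumps({"record": block["record"], "prev": prev}, sort_keys=True)
        if block["prev"] != prev or block["hash"] != hashlib.sha256(payload.encode()).hexdigest():
            return False
    return True

history = []
add_block(history, {"party": "manufacturer", "event": "engine built"})
add_block(history, {"party": "service company", "event": "fan blade inspection"})
```

Because each block's hash covers the previous block's hash, no single party can quietly rewrite an earlier entry - which is what makes the shared history "irrefutable".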
“Like all emerging technologies there are only going to be one or two applications that are going to come up for this kind of thing in the very early days,” says Stephen Jeffs-Watts, Senior Advisor – Service Management at IFS. Stephen is an expert in blockchain and a keen enthusiast of its benefits, but warns that field service shouldn’t get too carried away just yet, particularly as sectors are only starting to dip their toes in the murky blockchain water.
"We have to try and bear in mind that it [blockchain] is also directly proportionate to the type of kit that’s been installed...“
“A lot of the use cases that are coming up at the moment,” he tells me, “are in very high-value assets and very highly regulated supply chains; in aerospace, defence, nuclear and very high-end medical applications,” he pauses. “There aren’t too many Philips Medicals out there.”
In field service, blockchain technology can potentially trace parts, verify assets and look-up maintenance and operations history, but according to Stephen, it needs to bed-in with modern hardware before its benefits can be felt. “We have to try and bear in mind that it [blockchain] is also directly proportionate to the type of kit that’s been installed,” he warns, “Are you really going to use blockchain to authenticate the asset history or the maintenance and servicing history for a ten-year-old piece of equipment?” Another pause, “You’re not.”
Let’s go back to the jet engine blockchain analogy; the engine itself is a high-end piece of equipment.
The airlines and engine manufacturers themselves are high-end companies: BA, KLM, Lufthansa, Rolls-Royce, GE, Northrop Grumman, for example. All are big companies keen to monetise blockchain; the only real way to do this is through data ownership, but in a high-asset blockchain this isn’t always straightforward.
Who owns the data from a jet-engine? Is it the airlines?
The thrust from their plane goes through that engine, and what about linking that to the pilot flying the aircraft? That’s the airline’s data too. They also have a hand in the plane’s load: the number of passengers, baggage, fuel and so on. That’s also the airline’s data.
The engine itself? Rolls-Royce might run it on a power-by-the-hour contract, so it’s their engine - do they therefore own the blockchain data? Like that other revolution, IoT, blockchain becomes an issue of data ownership. What can be done to grease the chains and make the process run smoother?
“You’re going to have to get industries and supply chains to actually come together and solve the underlying data ownership issue,” Steve offers. “There is going to have to be some kind of consensus: an informal consensus through co-operation, the introduction of some kind of industry standard, or ultimately an enforced consensus through legislative means.”
Be it an industry standard or a regulatory framework, large-scale blockchain implementation ultimately needs sectors to work together and come to an agreement; as Steve explains, it also becomes an issue of trust. “Let’s say there are ten people involved in the supply chain: the operator, the Original Equipment Manufacturer (OEM), there may be a service operator; they’re all contributing data to that chain.
“But does the end operator actually have enough trust in the OEM, or will it question whether they are going to use their data and benchmark it against competitors?” he ponders.
Issues around data ownership, trust and older equipment unable to handle what is essentially a large-scale shared Google document are indicators that large-scale field-service blockchain implementation isn’t as close as we might think. Perhaps we are setting our sights too high? Maybe the use-cases should be carried out on a much smaller scale?
After all, cryptocurrency, the original application of blockchain, was designed for electronic financial transactions, not necessarily jet engines. Stephen agrees; referencing a well-known tracking device, he suggests we should keep things simple. “We could use blockchain like a glorified RFID tag that authenticates, verifies and gives you a reference point,” he says. “I can look at the blockchain and I can see who made it, when it was made, how it was transported.
“There may be just a couple of parameters about its last usage; you can look at that at a component-by-component level, specifically in those cases where that kind of information is critical, or the authenticity is critical.
"There’s got to be a realistic level of ambition and some specific use-cases that prove the technology and prove the value of the technology before there comes any mainstream adoption..."
There’s got to be a realistic level of ambition and some specific use-cases that prove the technology and prove the value of the technology before there comes any mainstream adoption,” Stephen urges.
My conversation with Steve was fascinating, and I’m sincerely grateful for his contribution to this article. The insight he offered - most of which I’m unable to fit into this wordcount - was invaluable. Yet despite all of blockchain’s potential, Stephen left me with a thought that goes beyond the hype: “So what?”
So what if an asset is pumping with blockchain data? All the customer wants is the device to start working again so they can get on with their business.
“What value does that bring to me as a customer,” argues Steve, “unless I’m in a highly regulated environment? When do you start loading up past maintenance history? Is it good? Is it worthwhile? Probably not. So what’s the use-case that’s going to give killer value?”
Steve continues from the end user's perspective: “Great, you’ve got blockchain. What do I get from you having blockchain? What do I get from being able to prove every last working second of this particular piece of kit? Why should I care?”
It’s an excellent point that perhaps gets lost in this fourth industrial revolution we find ourselves in. Among AI, IoT, machine learning and blockchain, should we not just focus on the customer’s needs and requirements? Or will we continue to pursue the hype?
Nov 16, 2018 • Features • Future of Field Service • IIOT • field service • GE Digital • data analysis • Edge Computing • George Walker • Industrial Internet of Things • Novotek • Predix
In the age of the industrial internet of things (IIoT), the speed of data analysis is key to effective operation. Edge computing accelerates this process, allowing for industrial data analysis to be performed at the point of collection.
Here, George Walker, managing director of industrial control and automation provider Novotek UK and Ireland, explains the core benefits of edge computing.
Edge computing is the term for when process data is collected, processed and analysed in a local device, as opposed to being transmitted to a centralised system. Supported by local cloud networks and IIoT platforms like GE Digital’s Predix, systems that support edge computing are proving increasingly popular as a means of streamlining the effectiveness of IIoT networks.
For plant and utility managers, this presents a range of opportunities not only to improve the efficiency of operations but also to overcome some of the limitations of centralised IIoT networks. In fact, there are three main ways that edge computing drives value in businesses.
Greater operational efficiency
Traditionally, analysis is carried out by transferring data externally, which can delay decision-making as errors take longer to be found. With edge-capable systems, large parts of the analysis can be carried out by the devices collecting the data.
The benefits of this are two-fold. For one, it can allow plant managers to access partial deep analysis in real time without waiting for lengthy analysis to be carried out externally. This means action can be taken earlier, streamlining the decision-making process.
The second benefit is that the IIoT platform, such as GE Digital’s Predix, can respond to operational data automatically, adjusting processes in real time. In effect, this allows for a self-correcting system that maximises uptime and reduces the need for manual maintenance.
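The self-correcting behaviour described above amounts to a control rule evaluated on the device itself. A minimal sketch, with a hypothetical oven setpoint and heater adjustment (not a Predix API - just the local decision logic):

```python
# Sketch of edge-side control: analyse a reading locally and adjust the
# process immediately, with no round trip to a central system.
# Setpoint, tolerance and step size are illustrative assumptions.
OVEN_SETPOINT_C = 180.0
TOLERANCE_C = 5.0

def edge_control(reading_c, heater_power):
    """Return a new heater power (0-100) when temperature drifts out of band."""
    if reading_c < OVEN_SETPOINT_C - TOLERANCE_C:
        return min(heater_power + 5, 100)  # too cold: boost heating
    if reading_c > OVEN_SETPOINT_C + TOLERANCE_C:
        return max(heater_power - 5, 0)    # too hot: ease off
    return heater_power                    # within band: no change
```

Because the rule runs at the point of collection, the correction happens within one sampling interval rather than after a network round trip.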
Overcoming network latency and bottlenecks
Traditionally, data analysis is carried out by having smart sensors send all their data to a remote location where it is analysed and processed. This is data intensive and can create problems if a network is not robust enough.
Channelling large amounts of data can cause network latency, which interrupts work within the plant, as messages running through the same network are delayed.
This is particularly problematic for applications where a system needs to react rapidly to a problem, such as an industrial oven control system in a food production plant, where even a temporary dip in temperature can result in a batch being unsuitable for market.
In addition to this, the sheer volume of raw data that can be generated in an industrial or utility plant is also likely to cause data bottlenecks in the wider network.
By using edge computing systems and a machine-learning IIoT platform, systems can respond to changes in real-time to prevent problems, while also having edge computers in place to compress the data and reduce network impact.
Lower operating costs
Due to the amount of information being produced, the cost of data storage is becoming a growing concern for companies. Edge computing, with its ability to process data without transmitting it, lightens the load on the network.
Processed data is also less substantial than raw data, as calculations can be made that allow the raw data to be compressed, reducing file sizes. As such, industrial companies can make more economical use of their cloud servers. By minimising storage requirements and the number of storage upgrades required, edge computing allows for a lower overall operating cost.
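One common form of this compression is summarising raw samples at the edge before they leave the device. A sketch, assuming 1 Hz readings collapsed into per-minute min/mean/max summaries (the window size and summary fields are illustrative choices):

```python
# Sketch of edge-side aggregation: reduce raw per-second readings to
# per-window summaries, cutting network and storage load.
def summarise(raw_readings, window=60):
    """Collapse each window of raw samples into min/mean/max."""
    summaries = []
    for i in range(0, len(raw_readings), window):
        chunk = raw_readings[i:i + window]
        summaries.append({
            "min": min(chunk),
            "mean": sum(chunk) / len(chunk),
            "max": max(chunk),
        })
    return summaries

raw = [20.0 + (i % 3) for i in range(120)]  # two minutes of 1 Hz samples
compact = summarise(raw)                    # 120 values become 2 summaries
```

Here two minutes of second-by-second data shrinks to two summary records, which is the storage and bandwidth saving the paragraph above describes.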
It’s clear that there are many benefits to edge computing, both from a financial and operational perspective. Whether a business is still considering adopting IIoT technology or is already making use of such systems, edge computing marks a step forward for businesses looking to streamline processes for efficiency and effectiveness.