Passing the Technology Stress Test: a critical step for successful technology deployment

The Oil & Gas industry has truly pushed the boundaries over time and delivered projects that were considered unthinkable not too long ago. Technology has truly been an enabler! However, when technology is seen as ‘only’ a project enhancer, it is far more challenging to get a listening ear. Also, the moment a project is delivered and production starts, the innovative mindset often changes, and the Oil & Gas industry ‘suddenly’ becomes a risk-averse industry. There are various good reasons for that. Every day we need to remind ourselves that we’re dealing with hydrocarbons. We want to make the most of our assets, and not unnecessarily experiment with new approaches that put humans, production, the environment, reputation or careers at risk. ‘No one’ wants to be the first, and the word ‘new’ scares people off.

 

And the above is not unique to Oil & Gas… But this doesn’t mean that there is no room for technology & innovation to get the most out of existing assets or to enhance projects.

 

The good news is: technology deployment can be done. In the past years, the founders of Deployment Matters led deployment activities in a major Oil & Gas operating company. Over 600 deployments were successfully delivered, with very significant impact on safety, production and cost.

 

A typical objection that comes up when discussing technology deployment: “our industry is very risk-averse”. There is truth in that, but our experience is also that ‘risk-averseness’ is often confused with other factors at play, such as:

  • End-users often don’t know where to start with technology.
  • End-users don’t have the time and/or required expertise.
  • Technology competes with many other things that can be done to improve performance.

By better understanding the key reasons why technology is not picked up, we can do something about it. We have captured our experience in tools, processes and critical success factors, applicable to Oil & Gas as well as other industries. One of these tools is the Technology Stress Test, based on more than 20 years of technology deployment experience. It gives key insights into what needs to be done to make technology deployment happen. Technology is assessed against ~30 criteria grouped under 7 themes, through a structured dialogue with key stakeholders around the table. Based on the outcome of the Technology Stress Test, specific actions can be taken to increase the chances that your technology gets deployed.

 

The Technology Stress Test applies to any (internal/external) technology provider that wants to see more pull from customers. It also applies to users that want to know what actions need to be taken to successfully deploy a chosen technology. The Technology Stress Test is most effective with all players around the table; not only representatives from the technology provider (including marketing/sales), but also representatives from the various teams within the end-user company that play a critical role for making the technology deployment happen.

 

If a technology has already been used multiple times, there is less need for a Technology Stress Test. But particularly when the technology is in the early stages of development or deployment, the Stress Test gives key insights into what is needed to get the technology going.

 

Our experience has been that technology deployment is mostly driven by non-technical aspects, and the Technology Stress Test reflects that. It is also our experience that 3-5 well-defined, targeted actions are often sufficient to get the technology deployment going. Based on our experience, we can support you with facilitating the Technology Stress Test and get you from A to B. If needed, we can support you all the way to successful deployment and sustainably embedding the technology into your business. Throughout the dialogue, it is important to keep in mind that it’s not the actual score that matters. What matters is the quality of the dialogue, and the actions as a result of the dialogue that can be completed within the desired timeline.

So, what are the themes that make up the Technology Stress Test? Let’s go through them, one by one. Based on the dialogue, a score will be given in the range of 0 – 5 per theme based on the current situation. Also, as part of the dialogue, actions are discussed and agreed such that the chances of success can be increased where needed, reflected in a higher score.

 

The first theme may sound trivial and easy, but actually it is not. It is important to make a realistic assessment of the absolute impact in terms of cost, production (or uptime), safety exposure reduction, CO2 emission reduction or potentially other measures. And it also needs to be put in perspective and compared to other things that can be done to improve performance. The impact number itself will not directly determine the score, it serves to start the dialogue.

 

The dialogue as part of this theme will be around the following.

 

Who is the specific person to whom you are ‘selling’ the technology? Note: it must be the person who benefits from the technology and gets it done (or makes sure that it gets done by influencing the decision maker).

 

How significant is the impact for this specific person, on a scale of 0-5?

 

For example: a technology can result in 100k euro OPEX savings per year. The engineering lead you are targeting has a budget of 1 million euro per year. In this case, the impact is very significant ⇒ 5. If you were targeting a person who holds a budget of 100 million euro per year, the relative impact would be low.
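The relative-impact logic above can be sketched in a few lines of code. Note that the article does not publish an official scoring rule; the thresholds below are illustrative assumptions, chosen only so that the worked example (100k savings against a 1 million budget) lands on a 5.

```python
# Hedged sketch: map annual savings relative to the target stakeholder's
# budget onto a 0-5 impact score. The thresholds are ASSUMPTIONS for
# illustration, not the official Technology Stress Test scoring rule.
def relative_impact_score(annual_savings: float, stakeholder_budget: float) -> int:
    """Return a 0-5 impact score based on the savings/budget ratio."""
    ratio = annual_savings / stakeholder_budget
    if ratio >= 0.10:    # savings of 10%+ of the budget: very significant
        return 5
    if ratio >= 0.05:
        return 4
    if ratio >= 0.02:
        return 3
    if ratio >= 0.01:
        return 2
    if ratio >= 0.001:
        return 1
    return 0

# The article's example: 100k euro savings vs a 1 million euro budget.
print(relative_impact_score(100_000, 1_000_000))    # 5: very significant
# Same savings vs a 100 million euro budget: low relative impact.
print(relative_impact_score(100_000, 100_000_000))  # 1
```

The point of the sketch is the ratio, not the thresholds: the same absolute saving scores very differently depending on whose budget it is measured against.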

 

As part of the structured dialogue, we will explore whether the right person is targeted. Is this really the person who has the most interest in the technology? Should you go one level down in the organisation (such that the business impact is of sufficient interest to the person you’re targeting), with the risk that the person has less authority to take decisions? Or should you go a level up, with the risk that the technology is of relatively less importance, and the risk that the person one level up can devote less time given the many other responsibilities that he/she has? Or perhaps target a person in a different line of the organisation? Or a combination (with the risk that there’s no clear owner to move things forward)? Does the person you’re targeting only have decision authority in a local organisation, or does the person have the authority to take decisions for deployment globally? These and other points will come up as part of the dialogue.

 

Part of this theme also is: are you presenting the technology in the right way, such that it resonates with the person you’re talking to? Should you change your pitch? If your messages are not right, the impact of the technology may be perceived as being lower than it actually is.

 

Also keep in mind: the higher up you go in the organization, the crisper you need to be in the delivery of the message. Furthermore, keep in mind that your messages must be understood by people from different backgrounds. Hence it is important to avoid jargon and abbreviations.

 

Whenever we do a Technology Stress Test, we ask people to introduce the technology using their existing marketing material. We then hold up the mirror and give feedback based on our technology deployment experience; feedback from the perspective of end-users in Oil & Gas companies. One of the typical outcomes of the Technology Stress Test is a change to the messaging, such that it resonates better with the end-user.

The best technologies are those that reduce cost AND increase production (or uptime) AND improve Health/Safety/Environment. Naturally, not all technologies will positively impact all three elements. If a technology scores negatively against one of these elements, then you can be sure that you will encounter resistance: from the people who own the performance indicator that is negatively impacted. It essentially means that you need to target a person one or more levels higher in the organisation, the person who owns all relevant performance indicators. Determining the person to whom you need to sell the idea is largely driven by Theme 2, in combination with Theme 1.

 

The dialogue as part of this theme will be around the following descriptions. Based on the descriptions given below, what is the current situation for your technology? Can you articulate the value in a different way, such that the impact on other metrics comes out more clearly?

0. The technology is not competitive with conventional solutions that can be applied to improve performance.
1. The technology improves performance on one metric [HSE, cost, production]; but has a negative impact on the other two.
2. The technology improves performance on two metrics [HSE, cost, production]; but has a negative impact on the other.
3. The technology improves performance on one metric [HSE, cost, production]; and keeps performance on the other metrics constant.
4. The technology improves performance on two metrics [HSE, cost, production]; and keeps performance on the other metric constant.
5. The technology improves HSE performance AND cost AND production performance.

 

Keep in mind that even when your technology has zero impact on one of the metrics, there can be resistance. After all, people may be afraid that the introduction of the technology will, for example, distract people. If your product has a positive impact on multiple dimensions, it is important to articulate that: the more dimensions you cover, the more champions you will have for the technology!

There are many examples of technologies where one part of the organisation gets the rewards, while another runs the risk. Or technologies that help the operator but eat into the profit of suppliers. The more asymmetric the risk/reward profile, the more resistance you can expect. The best technologies are truly win-win for all parties involved. If yours is not, then it is at least important to be fully aware of it, such that you can address the imbalance as much as possible. As part of the Technology Stress Test dialogue, we will give specific suggestions on how the imbalance can be addressed.

 

The dialogue will be around the following descriptions. Our experience: very often the reasons why technology deployments get stuck are related to this theme, so it is important to have a very good discussion around it. If you don’t, you may find forces at work that openly or quietly block the deployment. The key is to create a winning team.

 

0. One part of the company using the technology gets the benefits; other teams/people involved are negatively impacted; the regular service provider sees a reduction of revenue.
1. One part of the company using the technology gets the benefits; other teams/people involved have no benefits; the regular service provider sees a reduction of revenue.
2. The company using the technology gets the benefits (all teams/people); the regular service provider sees a reduction of revenue.
3. The company using the technology gets the benefits; the regular service provider has no benefits.
4. The company using the technology gets most benefits; the regular service provider benefits as well to an extent.
5. Balanced rewards across all players.

 

To illustrate the above a bit further: there are many technologies that can improve maintenance activities. In order to introduce these technologies, providers often have to work through maintenance contractors to get them to the end-user. Do these maintenance contractors have the incentive to introduce the innovations? Or would they not do it, because it reduces their revenue? Is it more attractive for them to promote conventional solutions?

 

And how do you handle the introduction of a technology that will result in job losses? How can you still introduce such technologies while managing the resistance that you may encounter? Depending on the specific situation, we will share examples from our experience of how we have handled such situations.

 

Under this theme, technology is assessed against 5 specific change management aspects that are relevant for getting technology deployed. The more complex the change, the more change management is required. The less change needed, the smoother the deployment will go.

 

Give 1 point for each item met. Take into account overall complexity/scale.

  • Can the technology be deployed without making any changes to the hardware of the facilities? If not, what actions are needed? Are these minor changes, or is it a project in itself?
  • Does the technology make use of existing data, IT hardware and integration? If not, what changes are needed?
  • Is the technology compatible with current processes/ways of working? If not, articulate what will have to be done differently. Would this e.g. require training of people?
  • Can the technology be covered from existing budgets? If not, what is needed to get the budget? Does it e.g. have to follow an annual budget cycle, with impact on timing for the deployment?
  • Is the technology in line with local rules & regulations? If not, does this require changes to the technology, or a dialogue with the regulator to change the rules & regulations?
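The one-point-per-item scoring above can be sketched as a simple checklist tally. The question labels below are paraphrases of the five bullets, not official criterion names, and the yes/no answers are a hypothetical example.

```python
# Hedged sketch of the change-management scoring: 1 point per question
# answered "yes". The labels paraphrase the five bullets above; the
# answers shown are a hypothetical worked example.
change_checks = {
    "deployable without facility hardware changes": True,
    "uses existing data, IT hardware and integration": True,
    "compatible with current processes/ways of working": False,  # training needed
    "coverable from existing budgets": True,
    "in line with local rules & regulations": True,
}

# In Python, True counts as 1 and False as 0, so summing the answers
# gives the theme score directly.
theme_score = sum(change_checks.values())
print(theme_score)  # 4 out of 5: one change-management action to plan
```

A low tally here is not a verdict on the technology; it flags how much change management work remains before deployment can go smoothly.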

 

As with the previous themes, it is important to not make this a ‘tick the box’ exercise, but to use the descriptions above to have a quality discussion around how much change is needed to introduce the technology, and what actions need to be taken to make the change as smooth as possible. If the amount of change is (perceived to be) too significant, then people may prefer to focus on other things they can do to improve the business.

 

But an apparently small change can also have implications. We were once involved with the introduction of a very high-value technology that required steel cutting and welding. A small project was created around this, with a total value of $0.5 million. Now in the world of projects of hundreds of millions, or billions, of dollars: who gets the lead over a $0.5 million project? The junior engineer fresh from university, who was given the project to learn on the job. Fortunately, in this case the person did a great job, and the technology was deployed successfully. It eventually helped to extend the field life longer than expected, and resulted in extra recovery of well over 1 million barrels! But we have also seen cases whereby the learning experience offered to a graduate resulted in a very slow deployment process.

It is often said that the Oil & Gas business is conservative, and there is a degree of truth in it. In all fairness though, it is understandable that people don’t want to unnecessarily experiment with new approaches to put humans, production, environment, reputation or careers at risk. The same way that we wouldn’t allow a stranger to experiment with the roof or the water piping system of our own house. The quickest wins can be realised by deploying technology successfully used or assessed by others. And usually the ‘closer’ the other user, the more convincing the deployment reference will be. Like in our daily life.

 

The criteria listed below may sound trivial. Our experience, though, is that many users tend to believe that their business situation is unique; it hardly ever is, as there are almost always analogues. And even if the specific technology is novel, it usually contains underlying components that have been proven already. For example, last year we did a Technology Stress Test around a new type of pump. Although the product itself is new, it makes use of principles that have already been proven and used in pumps that have been in operation for many years ⇒ a great reference to address the questions that users had about the reliability of the novel pump.

 

Actions related to this theme therefore very often involve linking up a potential user in one company with another user who already has experience with the technology (or with a technology of a similar nature). Nothing helps more than a direct conversation between users, and we can use our network and experience to connect you with the right people and facilitate an exchange. As a supplier, it’s in your interest to facilitate such an exchange. Naturally, if users have had a less positive experience, make sure that you can explain what happened, and the steps you have taken to improve the product.

 

Use the matrix and the questions/pointers below to guide the discussion. Based on that, make a judgement of the current score, and the actions you can take to increase the score.

 

  • Has the technology been used before? Can you refer to technologies that make use of similar principles? Have components already been used successfully, such that you can use that reference to e.g. address reliability concerns?
  • In case the technology has been used or assessed before: is the target end-user in touch with the other users/experts? Are users/experts prepared to share their experiences? If not, what can be done such that they are prepared to share?
  • How important is it for the end-user that there is industry experience? In other words, would the end-user be prepared to be the first? Or is the culture/strategy in the company of the end-user such that they are a (slow) follower?

 

 

The themes discussed so far were mostly non-technical. It is indeed our experience that for successful technology deployment, technical facts are not sufficient; a deep understanding of human behaviour in particular is critical. But obviously the technology has to be technically sound. These aspects are covered under this theme. If you don’t meet the criteria under this theme, then the specific actions needed are relatively straightforward, certainly for the first three bullets. Typically, most discussion is around the last two bullets…

 

Give a score 0-5 based on the below.

  • Are the product specifications supported by evidence?
  • Is the technology suitable for the specific application? Do in-depth technical review/studies confirm the applicability?
  • Is the technology in line with industry standards?
  • Does the technology have the explicit support from the relevant expert? Is his/her opinion (widely) known and do you make use of the review when promoting the technology? Is the view accepted by the end-users?
  • Does the user have the capability and know-how to support the technology deployment and to sustainably embed the technology?

 

Last, but certainly not least! Under this theme, a structured dialogue takes place around five specific questions that the Procurement department in the end-user company may raise. Again, there may be other elements, but these are the ones that have come up many times during the years we were involved in technology deployment. Needless to say, it is important to have someone from Procurement in the end-user company around the table if you suspect that the deployment hurdles are related to this theme!

 

Give 1 point for each item met. Note: these questions are formulated from the perspective of Procurement in the end-user company.

  • Are there multiple suppliers for this technology?
  • Are tendering requirements being met?
  • Can the technology be obtained through a contract with an existing supplier, either directly or indirectly?
  • Does the supplier already have a presence in the country?
  • Is usage of the technology in line with the Procurement key performance indicators?

 

As with the previous themes, it is important to not make this a ‘tick the box’ exercise, but to use it to trigger ideas on the deployment path with the least resistance. To illustrate with a practical example: in the past years, we rolled out a novel scaffolding technology. This scaffolding technology was developed by a small company and delivered to the end-user through a large maintenance contractor. Following the first successful deployment, the target customers for deployments 2 and 3 were determined simply by asking: where else do we work with this maintenance contractor? Can the technology be delivered through contracts already in place, leveraging existing relationships?

 

Once the assessment against the 7 themes is done, the scores can be mapped on a spider diagram. And from there it can be translated into a simple Impact versus Do-ability chart, with for Impact the lowest of Themes 1 and 2; and for Do-ability the lowest of Themes 3-7. The Impact versus Do-Ability chart is particularly useful to summarise the outcome of multiple Technology Stress Tests in one plot.
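The mapping described above (Impact as the lowest of Themes 1 and 2, Do-ability as the lowest of Themes 3 to 7) can be written as a small helper. The theme scores used here are a hypothetical stress-test outcome, not data from the article.

```python
# Sketch of the Impact vs Do-ability mapping: each axis takes the LOWEST
# score among its themes, reflecting the weakest-link principle.
def impact_doability(theme_scores: dict) -> tuple:
    """theme_scores maps theme number (1-7) to its 0-5 score."""
    impact = min(theme_scores[t] for t in (1, 2))        # Themes 1-2
    doability = min(theme_scores[t] for t in range(3, 8))  # Themes 3-7
    return impact, doability

# Hypothetical outcome of a stress test (theme number -> score).
scores = {1: 5, 2: 4, 3: 3, 4: 5, 5: 2, 6: 4, 7: 5}
print(impact_doability(scores))  # (4, 2): one weak theme caps the whole axis
```

This makes the weakest-link behaviour explicit: a single low do-ability theme pulls the entire Do-ability score down, which is exactly the question addressed below.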

 

Often when we facilitate a Technology Stress Test, the question comes up: “so if we score low on one do-ability theme, it impacts the overall do-ability score?” The answer is YES! The weakest link determines the overall strength.

 

In terms of the colour coding of the Impact versus Do-Ability chart: naturally the colouring changes when you’re close to a major milestone. If a project is taking Final Investment Decision next week, then the matrix is dark red… In other words, timing is key.

 

Assessment of technology using our stress test tool, based on our deployment experience obtained over the past 20 years. Technology is assessed against ~30 criteria grouped under 7 themes. Once the assessment against the 7 themes is done, the scores can be translated into a simple Impact versus Do-ability chart. Based on the outcome of the Technology Stress Test, specific actions can be taken to increase Impact/Do-ability

 

What does the matrix – and the Technology Stress Test – look like for a technology that is a true enabler for a project? In other words: without the technology, there is no project. In that case, the question of whether or not to pursue the technology doesn’t depend so much on the technology itself, but on how attractive the project is compared to the other investment opportunities that your company has. There are not many technologies that truly are enablers…

 

Now what if your technology didn’t pass the Technology Stress Test, despite specific actions to increase the Impact/Do-ability? Does that mean your technology has no chance? Not necessarily. However, it usually does mean that you first need to build trust by delivering value through technologies that do pass the test. And if you don’t have such technologies in your own portfolio, then it is important to be part of a ‘bundle’ with technologies from other suppliers. We can give you various tactics as part of the Technology Stress Test exercise.

 

Once your technology passed the Stress Test, will it be adopted automatically?

 

Passing the Technology Stress Test is a prerequisite for successful deployment, but not sufficient. Getting your technology successfully deployed is a lot of hard work, and the devil is in the detail. Operators can support the uptake of technology by streamlining their Technology Deployment organisation, which is a topic in itself. We are advising multiple Oil & Gas companies on how to organise technology deployment effectively; please contact us if you want to hear more about what we can do for your company.

 

However, a successful Technology Stress Test is a solid basis! As indicated in the introduction, this approach essentially formed the basis for >600 successful deployments in recent years. Once you’ve gone through the exercise a number of times, it becomes second nature. Technology deployment is difficult enough, so it is important not to introduce unnecessary barriers. Once the momentum is there through successful initial deployments, wider replication goes quickly!

The Technology Stress Test is available to suppliers as part of the Technology Catalogue Premium subscription. It is also discussed in detail during our Technology Deployment workshops.

 

Contact us at info@deploymentmatters.com for further info or questions.

Disclaimer
User of the Technology Stress Test acknowledges and agrees that Deployment Matters cannot guarantee that User’s technology will succeed. User shall be solely responsible for making all decisions and taking actions related to its business. User hereby acknowledges that it is not possible to guarantee that the technology subject to the Stress Test will be successful within a specified time frame or at all. In particular, the User further acknowledges that it is not possible for Deployment Matters to guarantee that the outcome of the Stress Test will generate any business. So long as Deployment Matters complies with the obligations of this Agreement, the User hereby acknowledges and agrees that Deployment Matters shall not be liable for the failure of the Stress Test to generate any useful business.
