10 things we’ve learned about quality assurance for IoT

One thing you get used to working in IT is that there is always something new to learn. That can be kind of daunting, but the challenge is also fun. Continuous learning is important for developers and testers alike. In this article, Alex (tester) and Tim (developer) will present some of the things they have learned about testing and quality for IoT in the past year.

1. There is a lot to think about!

The first time we sat down together to look at quality and testing for IoT, we were blown away by how many potential risks, challenges and test objects there are. Our first conversations were about just how many things there might be to test: from sensors and third-party software, to the use cases, interfaces and edge cases.

The challenges of IoT are well-known. We are both trainers for the Certified Professional for IoT [1], and the first thing the course addresses is the difficulties that IoT systems present for quality. Examples include:

  • Diversity and fragmentation of devices and systems
  • Few standards
  • The necessity for multiple stakeholders and domain experts to work together
  • Complex scenarios of usage, updating and maintenance and even end of life
  • Security concerns   

You also only need to look at examples from the media to see how IoT solutions can be used in ways we aren’t good at imagining yet (e.g. smart home technology being used in domestic abuse cases [2]) or for embarrassing failures such as Alexa ordering dolls’ houses [3].

2. Functionality isn’t the only quality attribute

An immediate realisation is that functional testing (while important) is only a small part of the testing that is essential. Many software projects today can “get away” with doing mostly functional testing and minimal performance or load testing. For IoT systems, however, the whole breadth of quality attributes should at least be considered. Take reliability, for example, which includes aspects such as availability, fault tolerance and recoverability [4]. A recent example was a problem with Nest that left users without heating in winter [5]. Failing gracefully and recovering from errors are incredibly important when such systems are interwoven with our daily lives.
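To make the idea of graceful failure concrete, here is a minimal sketch of the retry-then-degrade pattern, using an entirely hypothetical sensor and safe-default value (neither comes from the Nest incident; the names `FlakySensor`, `read_temperature` and `SAFE_DEFAULT_C` are our own illustration):

```python
import time

SAFE_DEFAULT_C = 20.0  # hypothetical safe setpoint for a heating controller


class FlakySensor:
    """Stand-in for a real device: fails the first two reads, then recovers."""

    def __init__(self):
        self.calls = 0

    def read(self):
        self.calls += 1
        if self.calls <= 2:
            raise TimeoutError("sensor did not respond")
        return 21.5


def read_temperature(sensor, retries=3, delay_s=0.0):
    """Retry a flaky read; degrade gracefully to a safe default on failure."""
    for _ in range(retries):
        try:
            return sensor.read()
        except TimeoutError:
            time.sleep(delay_s)  # back off before retrying
    return SAFE_DEFAULT_C  # recoverability: never crash the heating loop


print(read_temperature(FlakySensor()))             # succeeds on the third attempt: 21.5
print(read_temperature(FlakySensor(), retries=2))  # gives up and degrades: 20.0
```

The design choice to return a safe default rather than raise is exactly the kind of trade-off (availability vs. correctness) that a reliability discussion has to settle per system.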

3. We can’t even decide to just “consider all the quality attributes”

The long list of quality attributes might lead us to believe that we’re just going to have to “test all the things”. Unfortunately, this is just as impossible as it has always been. In IoT systems, some quality attributes might compete with each other. The more maintainable or user-friendly a system is, the less secure it might be. New data protection rules might mean that a solution or a feature isn’t even viable: it would solve a problem but would not conform to data protection law. For each system we design, we need to discuss which aspects are important, how quality attributes will affect each other - and which ones should have priority.

4. We’re going to have to be more pragmatic and explicit

Discussions like the ones above will necessarily lead us to being more explicit about which aspects or risks we are not testing. Non-tested attributes are theoretically a part of any quality strategy; however, we often tend to gloss over them. Making informed decisions about where our priorities are and what we are deliberately ignoring (until risk factors change) is just as important as deciding what to test.

Coupled with this is a need for pragmatism in testing. We know that we can’t test everything - but we might also need to define what “good enough” quality is in each project’s context. Striking the balance between the risk of (catastrophic) failure and the cost of trying to test too much will be hard, but it is necessary for competitive advantage.

5. Our driving factor with all decisions needs to be value

Many of the talks we’ve seen about IoT solutions seem to succumb to the sheer joy of playing with new technologies. Playing is obviously important for getting a handle on new things, but we need to be careful that the systems we design actually solve a problem. Ideally, any digitised system should improve on an existing system (automating a process, increasing decision-making capabilities, reducing costs or errors, adding comfort). This might seem far removed from “testing”, but the quality of a system is also in its usefulness and whether it solves a problem. Making sure we know what the use cases and business cases are will help us to create systems that don’t just do “cool stuff”, but actually improve our processes or quality of life.

As an example: if I create a system to avoid sending an expert to Australia once a month, then I should at least consider whether the costs and risks of sending the expert are balanced by the creation and maintenance of the system… If it turns out that the system is considerably more expensive and doesn’t mitigate any risks - or if the domain expert is no longer needed, but a technical expert needs to travel every two weeks to update the system, then I may have missed the mark!

Our tip: Asking ourselves “what’s that for?” and “who or what are we targeting?” will help us to remember that IoT isn’t primarily about technology - it’s about people. It’s about the value we gain from the IoT solution. No user or customer pays for an awesome technology stack; they pay for the added value that the system brings them.

6. We can’t relax once the software is deployed

It used to be the case that the software lifecycle pretty much ended with deployment (from a development perspective). Operational services were responsible for any runtime problems later on. With DevOps, the separation between development and operations was reduced - and this is even more important in IoT projects. The sheer volatility and pace of change (not to mention security patches) mean that updates will be necessary, frequent, and will need to be tested in the environment they are deployed in.

Even end of life will require more attention from a quality perspective. A traditional web application project’s end of life may be as easy as shutting down some web servers or cloud instances. The distributed hardware in an IoT project makes the end of life process much more complicated. Should IoT hardware run forever? Is that even imaginable or desirable? How can we test or certify the longevity of such solutions? What steps need to be taken to ensure a high-quality end of service in terms of data and comfort?

7. It’s technologically challenging but also socially challenging

It’s natural to see the list of languages, protocols and frameworks for IoT and wonder how we can master them all. Nevertheless, alongside the technological challenges, the inclusion and intrusion of IT systems into our society and private life also mean that we are facing social and ethical challenges more than ever before. The number of people affected by our systems is increasing - and with it the number of potential stakeholders, users and abusers.

Ethical considerations are important from the beginning of a project and must be reviewed based on new findings and experience. We need to learn from our own and others’ mistakes (e.g. taxi apps that hike prices when your phone battery is low) and consider ethics as an important quality attribute for modern systems. A good starting point for a framework is the ACM Code of Ethics [6].

8. Agility is going to become even more of a success factor

The agile principles help teams and organisations to respond to change, to deal with unpredictability and to prioritise timely feedback. For emerging markets and new technologies, it makes sense to follow principles that embrace unpredictability and allow for adaptability. Agility is people-, quality- and value-focused. Following the principles (as opposed to only the rituals of a specific framework) will be necessary to ensure high-quality, valuable solutions for customers.

A specific example of this is the importance of cross-functional teams in agile. The more domains that our systems connect, the less likely it is that any one person has understanding and mastery over them. In a pilot project with three companies last year, we realised the importance of having customer input for the domain specifics, as well as hardware experts and technology / data experts on board alongside our normal development teams.

9. Testing is still going to be important, but it might look different

One prediction that technophiles are fond of giving is that testing will be completely automated and exclusively technical. One thing that we both agree on is that automated testing alone will not be sufficient. We already know that we can only automate against known risks - and the number of potential “unknown unknowns” [7] in this new area is high. Good exploratory testing skills are going to be crucial.

Where we still disagree is on how technical testers will have to become in this new landscape. That is something that project experience will show.

10. Is it really just IoT?

One of the most interesting lessons we learned from our efforts was that we’re not sure whether “quality in IoT” is even the right name for what we’re working on! IoT is just one way of creating modern systems and networks to help people and add value. It certainly brings challenges, but we warned above about becoming too technology-centric. The points we’ve made in this article are just as valid for any modern, distributed, data-processing digital system.

Summary

Based on these lessons in quality for such distributed digital solutions, we can try to predict what will be important. However, as mentioned in point 8 - the most important skill will be reacting to changes and learning as we go.

Which leads us to our final point - considering the speed of innovation in this area, we’re surprised to find relatively few people talking about quality. We don’t really want to believe that people still underestimate its importance, although this could be a factor. Our current theory is that we, as an industry, are still focused on getting IoT solutions to work at all. We hope this changes soon! Quality cannot simply be added later, and the complexity of the systems we’re creating means that a lack of quality could have serious consequences.


References

[1] https://isqi.org/en/asqf-certified-professional-for-iot-cpiot and https://www.bredex.de/leistungen/weiterbildungen/details/asqfr-certified-professional-for-iot/

[2] https://www.wired.co.uk/article/internet-of-things-smart-home-domestic-abuse

[3] https://www.theregister.co.uk/2017/01/07/tv_anchor_says_alexa_buy_me_a_dollhouse_and_she_does/

[4] https://iso25000.com/index.php/en/iso-25000-standards/iso-25010

[5] https://www.cbc.ca/news/technology/nest-smart-home-problems-1.3410143 and https://www.telegraph.co.uk/technology/news/12099033/Nest-thermostat-owners-left-without-heating-after-software-glitch.html

[6] https://www.acm.org/code-of-ethics

[7] http://www.womentesters.com/all_roads_lead_to_et/

