
Cracking the code: unethical software and where to cast liability

With the recent revelation that a certain European car manufacturer’s vehicles are equipped with software allowing them to appear less polluting than they truly are, the role of software engineers in the creation of code which has unintended results is under the spotlight. This particular matter echoes an earlier emissions debate involving another manufacturer last year, where the US Environmental Protection Agency discovered that many cars being sold in America had a defeat device - software in their diesel engines that could detect when they were being tested and change the performance accordingly to improve results. Concerns about unethical software are now spreading beyond the car industry. Take, for instance, ticket touts who use software bots to harvest concert tickets in bulk and resell them at vast markups. Individuals caught in the act face unlimited fines in the UK as part of a crackdown on highly profitable resale sites such as Viagogo, StubHub and GetMeIn. John D. McGonagle, Senior Associate at DLA Piper, provides his thoughts on the concerns surrounding unethical software and asks whether there needs to be a total overhaul of the legal system around software engineers and coding.

It has been clear for some time that negotiating modern life requires reliance on software for almost everything. While the ‘internet of things,’ or IoT - a ubiquitous network of computers and electronic devices - isn’t a new idea, it is beginning to feel plausible that in the near future washing machines will text our smartphones to tell us our clothes are dry, fridges will replenish milk automatically, and cars won’t need drivers. In fact, IoT now refers to a body of software and internet-enabled technology so potentially vast that the term is beginning to lose its meaning. To sustain such growth and to protect consumers, businesses and those developing software, there is an argument that more needs to be done to identify where responsibility lies, should software start doing more harm than good.

When the exhaust emissions scandal first erupted, the manufacturer in question speculated that the offending software was created by ‘a couple of software engineers,’ who did it for unknown reasons. However, the truth was rather more complex. It seems increasingly likely that a whole layer of management was involved in a ‘normalisation of deviance’ - where people within an organisation become so accustomed to minor deviations that they no longer consider the overall result (an accumulation of deviations) deviant, even though it far exceeds their own limits, be they ethical, moral or otherwise.

In the case of the IoT, the most granular data about individual consumers, down to their thermostat settings, might be available to hackers who can infiltrate insecure or carelessly coded software to access the wireless networks that connect hundreds of devices, or even the devices themselves.

As a profession, software engineers aren’t subject to formal regulation, and there aren’t any widely adopted codes of practice. This absence of what some might call ‘professionalism,’ and the lack of formal qualifications, isn’t necessarily a bad thing. Many traditional professions have archaic notions of professionalism which can seem hilariously quaint and backward to today’s teenagers and 20-somethings, who more often than not are the ones developing, writing and coming up with the ideas for new software.

Nevertheless, if you are a software engineer who is self-employed, or working for a client on a short-term engagement, there is often no ‘foreman,’ and arguably limited incentive to make sure your work is the best it can be, or of a standard and quality which fosters trust and a healthy long-term business relationship. Interestingly, there are arguably no tangible repercussions if code leads to immoral or unethical results after different uses and iterations. In fact, most commercial software is still shipped with serious flaws. It is perhaps only a matter of time until a software program which is either carelessly or maliciously unfit for purpose results in a catastrophe. In such a circumstance it’s not obvious where liability would fall. Would an incompetent or malicious software engineer argue that they owed no liability to end users? Would they resist or try to ignore claims of malpractice? Would they argue that they owe no obligations to an end user where software is sold or licensed to a manufacturer or reseller, then ‘resold’ to a consumer?

How do organisations protect themselves in these circumstances? From a legal perspective, an obvious first step to mitigate risk is to draw up a contract. When properly drafted and negotiated, contracts can be very helpful, but where software development work is outsourced in the digital sector it is often informal, or briefed in amidst urgent circumstances where agile software development is required and it’s difficult to identify which party’s contractual terms apply to the work being carried out. For example, a customer may instruct a software engineer to develop software on its standard terms, and the software engineer may purport to accept this instruction on the basis of its own standard terms (which, unsurprisingly, are pro-software engineer). This battle of the forms is traditionally won by the party who fired the last shot, that is, the last party to put forward terms and conditions that were not explicitly rejected by the recipient. If the software engineer wins this battle, its standard terms will contain very limited assurances about the performance of the software. The standard terms will also probably contain an express exclusion of the software engineer’s liability if the customer makes unauthorised modifications to the software, and an express limitation of liability preventing the customer from recovering from the software engineer any losses exceeding the price paid, in all but the most serious circumstances (i.e. death or personal injury).

Even where the instructing organisation believes its own standard terms are applicable, such standard terms may still be poorly drafted and limited in the warranties (contractual promises) that are granted by the software engineer. For example, the software engineer will probably grant a warranty that the software will perform for a certain period of months in accordance with an agreed technical specification, or have certain functional capabilities or security attributes. However, it would be highly unusual for a contract to contain an express warranty granted by the engineer that the software will not perform in a way which results in unethical or immoral results. The software engineer would argue that its job is to meet the customer’s technical specification, not assess the ethical or moral implications of the software performing as the customer requests.

It could credibly be argued that governing laws are failing to keep pace with a world in which software is so prevalent. Does there need to be a total overhaul of the legal system around software engineers and coding? Will there need to be a number of protracted high-profile cases before principles to help consumers, corporations, software engineers and lawyers are established? Do software engineers need to collaborate to introduce meaningful self-regulation, or mandatory codes of practice, before the Government steps in to enforce these kinds of ‘professional standards’ upon them?

While there are no specific codes of practice, there are suggested ways of writing software that prioritise safe, working code. CodeClan, Scotland’s digital skills academy, places emphasis on teaching ‘test-driven development.’ Following this style of programming, no code is written until the programmer has written a test that will validate that the code is working as expected (a simple sketch of the approach follows below). CodeClan also encourages pair programming, where two engineers work together on the same piece of code. This ensures code is reviewed and checked as it is written, rather than either after the fact or not at all. By introducing students to these disciplines early in their careers, CodeClan hopes to standardise such software development approaches. It is to be hoped that CodeClan, which has the support of the Scottish Government, ScotlandIS (Scotland’s digital technologies trade body), Skills Development Scotland, and the Scottish Qualifications Authority (‘SQA’), can embed test-driven development in new generations of coders.
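By way of illustration, here is a minimal sketch of test-driven development in Python. The function name emissions_within_limit and the limit values are hypothetical examples chosen for this article, not drawn from CodeClan’s curriculum; the point is simply that the failing tests are written first, and only then the smallest implementation that makes them pass.

import unittest

# Step 1 ('red'): write the tests first. They fail until the
# function under test exists and behaves as expected.
class TestEmissionsCheck(unittest.TestCase):
    def test_reading_below_limit_passes(self):
        self.assertTrue(emissions_within_limit(reading=0.04, limit=0.08))

    def test_reading_above_limit_fails(self):
        self.assertFalse(emissions_within_limit(reading=0.12, limit=0.08))

    def test_negative_reading_is_rejected(self):
        with self.assertRaises(ValueError):
            emissions_within_limit(reading=-0.01, limit=0.08)

# Step 2 ('green'): write the simplest code that makes the tests pass.
def emissions_within_limit(reading, limit):
    """Return True if an emissions reading is at or below the limit."""
    if reading < 0:
        raise ValueError("emissions reading cannot be negative")
    return reading <= limit

if __name__ == "__main__":
    unittest.main()

Run with python -m unittest: every new behaviour starts life as a failing test, so untested code never accumulates unnoticed, which is precisely the discipline the approach is meant to instil.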

At present, it’s arguable that the UK Government is consulting on specific digital regulations with no transparency, accountability, or involvement of the software development industry whatsoever. Without the input of the wider software community, it will undoubtedly take far longer than it should to agree principles and move forward. Sitting down with software engineers and understanding the advantages of ethical coding may be a good place to start if we want to crack this particular code in the not-too-distant future.

John D. McGonagle, Senior Associate

john.mcgonagle@dlapiper.com

DLA Piper, Edinburgh
