Editor in Chief
Robert Regis Hyle
Michael P. Voelker
Jason T. Williams
ITA Advisory Board
Security First Insurance Company
Millers Mutual Insurance
Craig S. Lowenthal
Berkley Technology Services
Gary H. Ouellette
Union Mutual of Vermont Companies
Glatfelter Insurance Group
Kenneth B. Zieden-Weber
For advertising, contact Jim Daggett at email@example.com or 631-241-3301.
For editorial, contact Robert Regis Hyle at firstname.lastname@example.org or 812-747-7159.
For reprints and licensing, please contact Jim Daggett at email@example.com or call 631-241-3301.
ITA Pro magazine is published five times per year in print and tablet editions, and three additional times in tablet-only form.
Copyright © 2016 by the Insurance Technology Association. All rights reserved. No part of this publication may be reproduced or transmitted in any form or by any means, electronic, mechanical, photocopying, recording or otherwise, without prior written permission of the ITA.
Autonomous vehicles suffered a setback earlier this year when a Tesla Model S crashed, killing its lone occupant. News reports indicate the accident may have been the result of an error by the driver, who believed the Tesla model was designed to be fully autonomous. It wasn't.
Still, many rushed to attack the technology even though the technology may not have been at fault. Others, warier about how far and how fast self-driving vehicles have progressed in the last year, were skeptical to begin with and are no less so today.
The Washington Post reported that the Model S is not designed to be fully autonomous, and if that is the case, the crash appears to have been a fatal mistake on the driver's part. I suppose one day, when our streets and highways are crowded with driverless vehicles, we will have much less to worry about when we get inside a car, but we have a few years ahead of us before such vehicles dominate the boulevards of our fair cities.
There remains much work ahead to make cars safer, and when you consider that the preponderance of accidents are caused by driver error, you have to believe that any technology that can eliminate mistakes, or warn drivers they are making them, is something we should all strive to support.
A few weeks ago, I was driving home on a dark country road when a deer jumped
in front of my non-autonomous Honda Pilot. I had no time to brake and ran into the
deer. In a discussion of the accident for another story I am writing, Celent’s Donald
Light pointed out one of the problems that still needs to be solved in such situations.
What if the deer had actually been a child? We are taught it is safer to go ahead and hit a deer than to try to avoid the collision, because swerving usually means striking something else, whether it be another vehicle, a guardrail, or a parked car.
If we could discern that what jumped in front of the vehicle was a child, not a deer, our minds would tell us to do whatever we can to avoid hitting the child, which, unfortunately, often puts the driver and other passengers in the vehicle in danger.
Driving cars can be complicated, so producing technology that effectively removes
some of those complications won’t come quickly or easily. That means keep your eyes
on the road, even if the technology tells you it’s not necessary. After all, when has technology ever failed us? ITA
Robert Regis Hyle