Page 1: Software Attacks for Embedded, Mobile, and Internet of Things

 

T12 Session
4/16/2015, 3:15 PM

"Software Attacks for Embedded, Mobile, and Internet of Things"

Presented by:
Jon Hagar, Independent Consultant

Brought to you by:

340 Corporate Way, Suite 300, Orange Park, FL 32073 · 888-268-8770 · 904-278-0524 · [email protected] · www.sqe.com

Page 2: Software Attacks for Embedded, Mobile, and Internet of Things

Jon Hagar

Independent Consultant Jon Hagar is an independent consultant working in software product integrity, testing, verification, and validation. For more than thirty-five years Jon has worked in software engineering, particularly testing, supporting projects which include control systems (avionics and auto), spacecraft, IoT, mobile-smart devices, and attack testing for smart phones. He authored Software Test Attacks to Break Mobile and Embedded Devices; has presented hundreds of classes and more than fifty conference presentations; and written numerous articles. Jon is an editor for ISO, IEEE, and OMG standards.  

Page 3: Software Attacks for Embedded, Mobile, and Internet of Things


Wearables and Smart Technology: Software Test Attacks for Embedded, Mobile, and IoT

Jon D. Hagar, Consultant, Grand Software Testing
[email protected]

Author: "Software Test Attacks to Break Mobile and Embedded Devices"

Copyright 2015, Jon D. Hagar, Grand Software Testing, LLC – "Software Test Attacks to Break Mobile and Embedded Devices"

*  Definitions  

*  Industry  Error  Trends  Taxonomy  

*  Developer  Attacks  

*  Basic  Attacks  for  the  Tester  

*  The  Big  “Scary”  Security  Attacks  

*  Summary  


Agenda  

Page 4: Software Attacks for Embedded, Mobile, and Internet of Things

4/8/15

2

*  Test  –  the  act  of  conducting  experiments  on  something  to  determine  the  quality  and  to  provide  information  to  stakeholders  *  Many  methods,  techniques,  approaches,  levels,  context  *  Considerations:  input,  environment,  output,  instrumentation  

* Quality(ies) – Value to someone (that they will pay for) * Functions * Non-functional * It "works" * Does no harm * Are there (critical) bugs?


Basic  Definitions  

*  As  the  names  imply,  these  are  devices—small,  held  in  the  hand,  connected  to  communication  networks,  including  *  Cell  and  smart  phones  –  apps    *  Tablets  *  Medical  devices  

*  Typically  have:  *  Many  of  the  problems  of  classic  embedded  systems  *  The  power  of  PCs/IT  *  More  user  interface  (UI)  than  classic  embedded  systems  *  Fast  and  frequent  updates  

*  However,  mobile  devices  are  “evolving”  with  more  power,  resources,  apps,  etc.    

*  Mobile  is  the  “hot”  area  of  computers/software  *  Testing  rules  and  concepts  are  still  evolving  

   

Mobile,  Smart,  Embedded,  IoT  and  Handheld  


Page 5: Software Attacks for Embedded, Mobile, and Internet of Things


* James Whittaker defines 4 fundamental capabilities that all software possesses:
1. Software accepts inputs from its environment
2. Software produces output and transmits it to its environment
3. Software stores data internally in one or more data structures
4. Software performs computations using input or stored data

* To this, we expand and refine based on mobile-IoT-embedded contexts: * Within time * Using specialized hardware and control (as a sub-case of items 1 and 2 above) * Security * Lifecycle


Software  Capabilities  

* From Wikipedia: Taxonomy is the practice and science of classification. The word finds its roots in the Greek τάξις, taxis (meaning 'order', 'arrangement') and νόμος, nomos ('law' or 'science'). Taxonomy uses taxonomic units, known as taxa (singular taxon). In addition, the word is also used as a count noun: a taxonomy, or taxonomic scheme, is a particular classification ("the taxonomy of ..."), arranged in a hierarchical structure.

*  Helping  to  “understand  and  know”  


Seeing  the  Eyes  of  the  Enemy  

Page 6: Software Attacks for Embedded, Mobile, and Internet of Things



Taxonomy (researched)

Super Category                                                      | Aero-Space | Med sys | Mobile | General
Time                                                                |     3      |    2    |   3    |
Interrupted - Saturation (over time)                                |    5.5     |         |        |
Time Boundary (failure from incompatible system time formats/values)|    0.5     |         |   1    |
Time - Race Conditions                                              |     3      |         |   1    |
Time - Long run usages                                              |     4      |         |   1    |   20
Interrupt - timing or priority inversions                           |    0.7     |    3    |        |
Date(s) wrong / cause problem                                       |    0.5     |         |        |    1
Clocks                                                              |     4      |         |   2    |
Computation - Flow                                                  |     6      |   23    |        |   19
Computation - on data                                               |     4      |    1    |   3    |    1


Taxonomy part 2

Super Category                                                      | Aero-Space | Med sys | Mobile | General
Data (wrong data loaded or used)                                    |     4      |  5.00   |   2    |
Initialization                                                      |     6      |  2.00   |   3    |    5
Pointers                                                            |     8      |  2.00   |  18    |   10
Logic and/or control law ordering                                   |     8      |   43    |   3    |   30
Loop control - Recursion                                            |     1      |         |        |
Decision point (if test structure)                                  |    0.5     |    1    |   1    |
Logically impossible & dead code                                    |    0.7     |         |        |
Operating system (lack of fault tolerance, interface to OS, other)  |    1.5     |    2    |   6    |
Software - Hardware interfaces                                      |    16      |         |  13    |
Software - Software interface                                       |     5      |  2.00   |   3    |
Software - Bad command (problem on server)                          |     3      |         |   5    |
UI - User/operator interface                                        |     4      |  5.00   |  20    |   10
UI - Bad alarm                                                      |    0.5     |         |   3    |
UI - Training (system fault resulting from improper training)       |            |    3    |        |
Other                                                               |   10.6     |  9.00   |   5    |    5

Note:  one  report  on  C/C++  indicated  70%  of  errors  found  involved  pointers  

Page 7: Software Attacks for Embedded, Mobile, and Internet of Things


*  Requirements  verification  checking  *  Necessary  but  not  sufficient  

*  Risk–based  Testing    *  old  but  tried  and  true  (in  many  contexts)  

* Attack–based testing with * New attacks to support exploration * Model-based * Math-based * Skill/experience-based

   

Where  Do  Testers  Go  Now?    


*  A  pattern  (of  testing)  based  on  a  common  mode  of  failure  seen  over  and  over  *  Part  of  Exploratory  Testing  *  May  be  seen  as  a  negative,  when  it  really  is  a  positive  *  Goes  after  the  “bugs”  that  may  be  in  the  software  *  May  include  or  use  classic  test  techniques  and  test  concepts  *  Lee  Copeland’s  book  on  test  design  *  Many  other  good  books  

*  A  Pattern  (more  than  a  process)  which  must  be  modified  for  the  context  at  hand  to  do  the  testing    

*  Testers  learn  mental  attack  patterns  working  over  the  years  in  a  specific  domain  

   

Attack-based Testing: What is an attack?


Page 8: Software Attacks for Embedded, Mobile, and Internet of Things

4/8/15

6

Attacks  (from  Software  Test  Attacks  to  Break  Mobile  and  Embedded  Devices)  

* Attack 1: Static Code Analysis
* Attack 2: Finding White–Box Data Computation Bugs
* Attack 3: White–Box Structural Logic Flow Coverage
* Attack 4: Finding Hardware–System Unhandled Uses in Software
* Attack 5: Hw-Sw and Sw-Hw Signal Interface Bugs
* Attack 6: Long Duration Control Attack Runs
* Attack 7: Breaking Software Logic and/or Control Laws
* Attack 8: Forcing the Unusual Bug Cases
* Attack 9: Breaking Software with Hardware and System Operations
* Sub-Attack 9.1: Breaking Battery Power
* Attack 10: Finding Bugs in Hardware–Software Communications
* Attack 11: Breaking Software Error Recovery
* Attack 12: Interface and Integration Testing
* Sub-Attack 12.1: Configuration Integration Evaluation
* Attack 13: Finding Problems in Software–System Fault Tolerance
* Attack 14: Breaking Digital Software Communications
* Attack 15: Finding Bugs in the Data
* Attack 16: Bugs in System–Software Computation
* Attack 17: Using Simulation and Stimulation to Drive Software Attacks
* Attack 18: Bugs in Timing Interrupts and Priority Inversion
* Attack 19: Finding Time Related Bugs
* Attack 20: Time Related Scenarios, Stories and Tours
* Attack 21: Performance Testing Introduction
* Attack 22: Finding Supporting (User) Documentation Problems
* Sub-Attack 22.1: Confirming Install-ability
* Attack 23: Finding Missing or Wrong Alarms
* Attack 24: Finding Bugs in Help Files
* Attack 25: Finding Bugs in Apps
* Attack 26: Testing Mobile and Embedded Games
* Attack 27: Attacking App–Cloud Dependencies
* Attack 28: Penetration Attack Test
* Sub-Attack 28.1: Penetration Authentication – Password Attack
* Sub-Attack 28.2: Fuzz Test
* Attack 29: Information Theft – Stealing Device Data
* Sub-Attack 29.1: Identity Social Engineering
* Attack 30: Spoofing Attacks
* Sub-Attack 30.1: Location and/or User Profile Spoof
* Sub-Attack 30.2: GPS Spoof
* Attack 31: Attacking Viruses on the Run in Factories or PLCs
* Attack 32: Using Combinatorial Tests
* Attack 33: Attacking Functional Bugs


1:  Developer  Attacks  for    Embedded,  Mobile  and  IoT  

Three  of  many  


Page 9: Software Attacks for Embedded, Mobile, and Internet of Things


Attack  1:  Static  Code  Analysis  (testing)    

*  When  to  apply  this  attack?    *   After/during  coding  

*  What  faults  make  this  attack  successful?  *  Many  *  Example:  Issues  with  pointers  

*  Who  conducts  this  attack?    *   Developer,  tester,  independent  party  

*  Where  is  this  attack  conducted?    *  Tool/test  lab  

* How to determine if the attack exposes failures? * Review warning messages and find true bugs

* How to conduct this attack (see the sketch below) * Obtain and run tool * Find and eliminate false positives * Identify and address real bugs * Repeat as code evolves * Single unit/object * Class/Group * Component * Full system
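As an illustration only (not from the book), here is a minimal C fragment with the kind of pointer defects this attack surfaces; a typical static analysis tool should warn about the unchecked allocation, the possible buffer overflow, and the leak, and the tester's job is then to separate true bugs from false positives.

    #include <stdlib.h>
    #include <string.h>

    /* Hypothetical helper: copies a sensor label into a new buffer. */
    char *copy_label(const char *label)
    {
        char *buf = malloc(32);
        strcpy(buf, label);   /* defect 1: buf may be NULL (malloc unchecked)  */
                              /* defect 2: no length check, possible overflow  */
        return buf;
    }

    int main(void)
    {
        char *p = copy_label("TEMP_SENSOR_01");
        (void)p;              /* defect 3: p is never freed, memory leak       */
        return 0;
    }

Running the tool first over a single unit like this, then over classes, components, and the full system, matches the "repeat as code evolves" step above.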


Attack  2:  Finding  White–Box  Data  Computation  Bugs    

*  When  to  apply  this  attack?    *   After/during  coding  

*  What  faults  make  this  attack  successful?  *  Mistakes  associated  with  data  *  Example:  Wrong  value  of  Pi  

*  Who  conducts  this  attack?    *   Developer,  tester,  independent  party  

*  Where  is  this  attack  conducted?    *  Development  Tool/test  lab    

* How to determine if the attack exposes failures? * Structural-data test success criteria not met

* How to conduct this attack * Obtain tool * Determine criteria and coverage * Create test automation with specific values (really a programming problem) * NOT NICE NUMBERS (see the sketch below) * Run automated test cases * Resolve failures * Peer check test cases * Repeat as code evolves
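A minimal sketch, not from the book, of the data-driven check this attack calls for: a hypothetical arc-length computation is exercised with deliberately "not nice" values and compared against an independently derived expectation, which is exactly how a wrong constant (for example a truncated value of Pi) gets exposed.

    #include <math.h>
    #include <stdio.h>

    #define PI 3.14159265358979323846   /* a truncated 3.14 here is the classic bug */

    /* Hypothetical unit under test: arc length travelled by a wheel encoder. */
    static double arc_length(double radius_m, double angle_deg)
    {
        return radius_m * angle_deg * PI / 180.0;
    }

    int main(void)
    {
        /* "Not nice numbers": awkward radii and angles, not 1.0 and 90.0. */
        const double radii[]  = { 0.0333, 0.127, 2.7182818 };
        const double angles[] = { 0.001, 33.3, 359.99 };
        const double ref_pi   = acos(-1.0);     /* independent reference constant */
        int failures = 0;

        for (int i = 0; i < 3; i++)
            for (int j = 0; j < 3; j++) {
                double got  = arc_length(radii[i], angles[j]);
                double want = radii[i] * angles[j] * ref_pi / 180.0;
                if (fabs(got - want) > 1e-9 * fabs(want)) {
                    printf("FAIL r=%g deg=%g got=%.12g want=%.12g\n",
                           radii[i], angles[j], got, want);
                    failures++;
                }
            }
        printf("%d failure(s)\n", failures);
        return failures != 0;
    }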


Page 10: Software Attacks for Embedded, Mobile, and Internet of Things


Attack  3:  White–Box  Structural  Logic  Flow  Coverage  

*  When  to  apply  this  attack?    *   After/during  coding  

*  What  faults  make  this  attack  successful?  *  Many  *  Example:  Statement  coverage  

*  Who  conducts  this  attack?    *   Developer,  tester,  independent    

*  Where  is  this  attack  conducted?    *  Tool/test  lab  

* How to determine if the attack exposes failures? * Coverage not met and/or success criteria fails

* How to conduct this attack * Obtain tool * Determine criteria and coverage * Create test automation with specific values to drive logic flow within code (see the sketch below) * Run automated test cases * Resolve failures * Peer check test cases * Repeat as code evolves
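Illustration only (not from the book): a single branch in a hypothetical mode-select routine and the two test calls needed to cover both outcomes. With gcc, compiling the same tests with --coverage and reporting with gcov is one way to confirm the statement/branch criterion was actually met.

    #include <assert.h>

    /* Hypothetical unit under test: decide whether the heater should run. */
    static int heater_on(double temp_c, double setpoint_c)
    {
        if (temp_c < setpoint_c - 0.5)   /* branch A: below the hysteresis band */
            return 1;
        return 0;                        /* branch B: at or above the band      */
    }

    int main(void)
    {
        /* One test per branch gives full statement and branch coverage here. */
        assert(heater_on(18.0, 20.0) == 1);   /* drives branch A */
        assert(heater_on(20.4, 20.0) == 0);   /* drives branch B */
        return 0;
    }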


Developer  Testing  Checklist  (partial  for  take  home)  

* Have I tested path coverage? * Have I tested with MEANINGFUL data? * Have I had my code reviewed? * Pairs * Desk checks * Peer review * Inspection * Walkthrough
* What automation did I use? * Is integration done? * Bottom up * Top down * Continuous
* Have I done static analysis of my code? * Me * Independent


Page 11: Software Attacks for Embedded, Mobile, and Internet of Things


2: Tester Basic Attacks – What is Missing, Usability, Alarms

Sampling  of  where  to  start  Exploratory  Testing  


Attack 4: Finding Hardware–System Unhandled Use Cases

*  When  to  apply  this  attack?    *   Starting  at  system-­‐software  analysis  

* What faults make this attack successful? * Lack of understanding of the real world * Example: Car braking on ice

*  Who  conducts  this  attack?    *   Developer,  tester,  analyst  

*  Where  is  this  attack  conducted?    *  Environments,  simulations,  field  

* How to determine if the attack exposes failures? * An unhandled condition exists * Note: data explosion problem

* How to conduct this attack (see the sketch below) * Knowledge * Out-of-box thinking * Operation concepts * Analysis * Modeling * Lab testing * Field testing * Feedback * Repeat
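Purely a sketch with hypothetical function names and thresholds (not from the book): a lab test for the "car braking on ice" style of unhandled case drives the control logic with a low-friction wheel-slip condition the requirements may never have mentioned and checks that some defined behavior results rather than an out-of-range command.

    #include <stdio.h>

    /* Hypothetical brake controller: returns commanded brake pressure (0..100%). */
    static int brake_command(double pedal_pct, double wheel_slip_ratio)
    {
        if (wheel_slip_ratio > 0.2)          /* low-friction case: back off the pressure; */
            return (int)(pedal_pct * 0.5);   /* if this branch were missing, this attack  */
                                             /* is what finds the unhandled condition     */
        return (int)pedal_pct;
    }

    int main(void)
    {
        /* Environment sweep: dry road, wet road, ice. */
        const double slip[] = { 0.02, 0.15, 0.6 };
        for (int i = 0; i < 3; i++) {
            int cmd = brake_command(100.0, slip[i]);
            printf("slip=%.2f -> brake=%d%%\n", slip[i], cmd);
            if (cmd < 0 || cmd > 100)
                printf("UNHANDLED CONDITION: command out of range\n");
        }
        return 0;
    }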


Page 12: Software Attacks for Embedded, Mobile, and Internet of Things


* When to apply this attack? …when your app/device has a user
* What faults make this attack successful? …devices are increasingly complex
* Who conducts this attack? …see chart on Roles
* Where is this attack conducted? …throughout the lifecycle and in users' environments

*  How  to  determine  if  the  attack  exposes  failures?  *  Unhappy  “users”  *  Bugs  found  *  See  sample  checklist  

Jean  Ann  Harrison  Copyright  2013  

Attack: Testing Usability (Mobile/IoT usability tends to be "poor")


* Refine checklist to context scope * Define a role * Watch what is happening with this role * Define a usage (many different user roles) * Guided explorations or ad hoc * Stress, unusual cases, explore options * Capture understanding, risk, observations, etc. * Checklist (watch for confusion of the tester) * Run exploratory attack(s) * Learn * Re-plan and redesign * Watch for bias * Switch testers * Repeat

 


Usability  Attack  Pattern  

Page 13: Software Attacks for Embedded, Mobile, and Internet of Things


* The developer(s)—see Attacks 1, 2, and 3 * The app architect or director * On-team tester(s) * In-company "dog food" testers * Independent test players * Mass beta trials * Not a tester—finally, consider who should not be a user

Note on roles: During the testing effort and as it progresses, don't forget that there are many different user roles - newbie, basic, advanced, impaired, etc.

Roles  to  Play  in  Assessing  Usability  (and  many  other  Apps)  


* User inputs * Use with optional "plug" devices (readers, sensors, trackballs, mice, accessories, etc.) => combinatorial test attack * Device "orientation" and status (on network, off, flat, rotated, etc.) * Ease of using inputs (1-to-5 scale)
* Graphic/display rendering—check (if they exist): * Fits in screen size (different sizes and devices) * Screen orientation (try all combinations) * Text—correct display, location, visible on screen; is the meaning clear; spelling; reader level * Check the whole display environment (including any hidden parts)
* Etc., etc. ……

Usability  Attack  Checklist  Example  

(shortened from “Software Test Attacks to Break Mobile and Embedded Devices”)


Page 14: Software Attacks for Embedded, Mobile, and Internet of Things


* Normal and stress functionality of RFID and/or Near Field communication * Normal and stress functionality of optical tags and/or quick response (QR) codes * Normal and stress functionality of high- and/or low-energy Bluetooth device(s) * Check M2M and H2M communication * Web * P2P * Impact on supporting apps, software, databases, etc.

Wearable-­‐IOT  Items  to  Check  (Enabling  Tech)  

May require a Combinatorial Attack (see the sketch below)
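A minimal sketch with illustrative factor values only (not from the book): enumerate every combination of a few communication factors and run the normal/stress checklist items against each one. Real pairwise tools cut this set down, but even a brute-force sweep over small factor lists is a usable starting point for the combinatorial attack.

    #include <stdio.h>

    int main(void)
    {
        /* Hypothetical factor lists for a wearable's enabling technology. */
        const char *link[] = { "BLE", "Bluetooth-classic", "NFC", "RFID" };
        const char *peer[] = { "M2M", "H2M", "P2P", "Web" };
        const char *load[] = { "normal", "stress" };
        int n = 0;

        for (int i = 0; i < 4; i++)
            for (int j = 0; j < 4; j++)
                for (int k = 0; k < 2; k++)
                    printf("case %2d: %s + %s under %s load\n",
                           ++n, link[i], peer[j], load[k]);
        /* 32 cases here; a pairwise tool would cover all pairs in far fewer. */
        return 0;
    }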


Attack  23:  Finding  Missing  or  Wrong  Alarms    

* When to apply this attack? * Device has alarms or information notifications to drive user interaction

* What faults make this attack successful? * Time or other interactions cause a notification or alarm to be missed

*  Who  conducts  this  attack?    *   Tester,  independent  party  

*  Where  is  this  attack  conducted?    *  Tool/test  lab,  field  

*  How  to  determine  if  the  attack  exposes  failures?    *  Alarm  is  missed  or  wrong  

* How to conduct this attack * Define alarms and conditions * Define risks of alarms in usage and time * Define strategy and test plan * Define use cases * Define test design within environments, including time * Run tests * Review for missing/wrong alarms and cases to "force", e.g. leap year (see the sketch below)
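One concrete way to "force" the leap-year case above (sketch only; the date check and the alarm scheduler it feeds are hypothetical, not from the book): verify that February 29 is accepted in leap years and rejected otherwise, since a wrong rule here silently drops or mis-times an alarm.

    #include <assert.h>

    /* Hypothetical date check used by an alarm scheduler. */
    static int is_leap(int year)
    {
        return (year % 4 == 0 && year % 100 != 0) || (year % 400 == 0);
    }

    static int alarm_date_valid(int year, int month, int day)
    {
        if (month == 2 && day == 29)
            return is_leap(year);   /* the classic missed/extra-alarm bug lives here */
        return 1;                   /* other validity checks omitted in this sketch  */
    }

    int main(void)
    {
        assert(alarm_date_valid(2016, 2, 29) == 1);   /* leap year: alarm must fire */
        assert(alarm_date_valid(2015, 2, 29) == 0);   /* non-leap year: reject      */
        assert(alarm_date_valid(1900, 2, 29) == 0);   /* century rule               */
        assert(alarm_date_valid(2000, 2, 29) == 1);   /* 400-year rule              */
        return 0;
    }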


Page 15: Software Attacks for Embedded, Mobile, and Internet of Things


3:  IoT,  Embedded,  and  Mobile  Security  Attacks  

 And Now for Something Completely Different

Well, At Least A Very Scary (Not Silly) Walk


* Fraud – Identity * Worms, viruses, etc. * Fault injection

*  Processing  on  the  run  *  Hacks  impact  *  Power  *  Memory  *  CPU  usage  


Embedded/Mobile  Security  Concerns  

• Eavesdropping – yes, everyone can hear you • Hijacking • Click-jacking • Voice/Screen

•  Physical  Hacks  •  File  snooping  •  Lost  phone  

Page 16: Software Attacks for Embedded, Mobile, and Internet of Things


*  Mobile-­‐Embedded  systems  are  highly  integrated  hardware–software–system  solutions  which:  *  Must  be  highly  trustworthy  since  they  handle  sensitive  data    *  Often  perform  critical  tasks  

* Security holes and problems abound * Coverity Scan 2010 Open Source Integrity Report - Android * Static analysis test attack found 0.47 defects per 1,000 SLOC * 359 defects in total, 88 of which were considered "high risk" in the security domain

* OS hole in Android demonstrated with Angry Birds (researchers Jon Oberheide and Zach Lanier) * Robots and drones rumored to be attacked * Cars and medical devices being hacked * Stuxnet virus and its family

The  Current  Security  Situation  


* Apply when the device is mobile and has * Account numbers * User IDs and passwords * Location tags * Restricted data
* Current authentication approaches in use on mobile devices (see the sketch below) * Server-based * Registry (user/password) * Location or device-based * Profile-based

Security  Attacks  
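A minimal sketch of the authentication/password sub-attack against a registry-style check (the login function is hypothetical, not from the book, and per the warnings later in this deck this is for a lab, on software you own or have written permission to test): try a short list of known-weak credentials and report any that are accepted.

    #include <stdio.h>
    #include <string.h>

    /* Hypothetical registry-style check on the device under test. */
    static int login_ok(const char *user, const char *pass)
    {
        /* Seeded weak factory default, standing in for the real check. */
        return strcmp(user, "admin") == 0 && strcmp(pass, "admin") == 0;
    }

    int main(void)
    {
        const char *weak[] = { "admin", "password", "1234", "0000", "letmein" };
        for (int i = 0; i < 5; i++)
            if (login_ok("admin", weak[i]))
                printf("WEAK CREDENTIAL ACCEPTED: admin/%s\n", weak[i]);
        return 0;
    }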


Page 17: Software Attacks for Embedded, Mobile, and Internet of Things


* Attack 28: Penetration Attack Test * Sub-Attack 28.1: Penetration Authentication – Password * Sub-Attack 28.2: Fuzz Test (see the sketch below) * Attack 29: Information Theft – Stealing Device Data * Sub-Attack 29.1: Identity Social Engineering * Attack 30: Spoofing Attacks * Sub-Attack 30.1: Location and/or User Profile Spoof * Sub-Attack 30.2: GPS Spoof

Security  Attacks      (only  a  starting  point  checklist  of  things  to  do)  
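As a sketch of Sub-Attack 28.2 only (the message parser below is hypothetical, and per the warnings on the next slide this belongs in a lab/sandbox against software you own or have permission to test): feed pseudo-random byte buffers to an input-handling routine and watch for crashes, hangs, or mishandled rejects.

    #include <stdint.h>
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    /* Hypothetical unit under test: parse a length-prefixed message.        */
    /* Returns 0 on accept, -1 on reject; a crash instead is a fuzz finding. */
    static int parse_message(const uint8_t *buf, size_t len)
    {
        if (len < 2) return -1;
        size_t payload_len = buf[0] | ((size_t)buf[1] << 8);
        if (payload_len > len - 2) return -1;   /* drop this check and fuzzing finds it */
        uint8_t copy[64];
        if (payload_len > sizeof copy) return -1;
        memcpy(copy, buf + 2, payload_len);
        return 0;
    }

    int main(void)
    {
        srand(12345);                            /* fixed seed so failures reproduce */
        uint8_t buf[256];
        for (int i = 0; i < 100000; i++) {
            size_t len = (size_t)(rand() % (int)sizeof buf);
            for (size_t j = 0; j < len; j++)
                buf[j] = (uint8_t)(rand() & 0xFF);
            (void)parse_message(buf, len);       /* run under a memory checker in the lab */
        }
        puts("fuzz run complete");
        return 0;
    }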


§ Security attacks must be done with the knowledge and approval of the owners of the system and software
§ Severe legal implications exist in this area
§ Many of these attacks must be done in a lab (sandbox)
§ In these attacks, I tell you conceptually how to "drive a car very fast (150 miles an hour)", but there are places to do this with a car legally (a race track) and places where you will get a ticket (most public streets)
§ Be forewarned - do not attack your favorite app on your phone or any connected server without the right permissions, due to the legal implications

Warnings  when  Conducting  Security  Attacks  


Page 18: Software Attacks for Embedded, Mobile, and Internet of Things


*  These  attacks  are  presented  at  a  summary  level  only  *  Much  more  detail  and  effort  are  needed  

*  Understanding  your  local  context  and  error  patterns  is  important      (one  size  does  NOT  fit  all)  

*  Attacks  are  patterns…you  still  must  THINK  and  tailor    

Wrap  Up  of  this  Session  


Attacks                  (from  Software  Test  Attacks  to  Break  Mobile  and  Embedded  Devices)  

* Attack 1: Static Code Analysis
* Attack 2: Finding White–Box Data Computation Bugs
* Attack 3: White–Box Structural Logic Flow Coverage
* Attack 4: Finding Hardware–System Unhandled Uses in Software
* Attack 5: Hw-Sw and Sw-Hw Signal Interface Bugs
* Attack 6: Long Duration Control Attack Runs
* Attack 7: Breaking Software Logic and/or Control Laws
* Attack 8: Forcing the Unusual Bug Cases
* Attack 9: Breaking Software with Hardware and System Operations
* Sub-Attack 9.1: Breaking Battery Power
* Attack 10: Finding Bugs in Hardware–Software Communications
* Attack 11: Breaking Software Error Recovery
* Attack 12: Interface and Integration Testing
* Sub-Attack 12.1: Configuration Integration Evaluation
* Attack 13: Finding Problems in Software–System Fault Tolerance
* Attack 14: Breaking Digital Software Communications
* Attack 15: Finding Bugs in the Data
* Attack 16: Bugs in System–Software Computation
* Attack 17: Using Simulation and Stimulation to Drive Software Attacks
* Attack 18: Bugs in Timing Interrupts and Priority Inversion
* Attack 19: Finding Time Related Bugs
* Attack 20: Time Related Scenarios, Stories and Tours
* Attack 21: Performance Testing Introduction
* Attack 22: Finding Supporting (User) Documentation Problems
* Sub-Attack 22.1: Confirming Install-ability
* Attack 23: Finding Missing or Wrong Alarms
* Attack 24: Finding Bugs in Help Files
* Attack 25: Finding Bugs in Apps
* Attack 26: Testing Mobile and Embedded Games
* Attack 27: Attacking App–Cloud Dependencies
* Attack 28: Penetration Attack Test
* Sub-Attack 28.1: Penetration Authentication – Password Attack
* Sub-Attack 28.2: Fuzz Test
* Attack 29: Information Theft – Stealing Device Data
* Sub-Attack 29.1: Identity Social Engineering
* Attack 30: Spoofing Attacks
* Sub-Attack 30.1: Location and/or User Profile Spoof
* Sub-Attack 30.2: GPS Spoof
* Attack 31: Attacking Viruses on the Run in Factories or PLCs
* Attack 32: Using Combinatorial Tests
* Attack 33: Attacking Functional Bugs


Page 19: Software Attacks for Embedded, Mobile, and Internet of Things


*  To  defeat  an  enemy,  you  must  know  the  bug  

* The mobile-IoT-embedded error data is limited; what exists has implications for test areas

*  Taxonomy  used  to  create  attack  patterns  indicates  that  in  industry  many  bugs  should  be  easy  to  find—if  a  few  simple  added  techniques  or  attacks  are  employed  

*  Software  is  in  nearly  everything  these  days  *  IoT/embedded  growing  at  a  scary  rate  


Summary  


* James Whittaker (attacks) * Elisabeth Hendrickson (simulations) * Lee Copeland (techniques) * Brian Marick (testing) * James Bach (exploratory and tours) * Cem Kaner (test thinking) * Jean Ann Harrison (her thinking and help)

* Many  teachers  *  Generations  past  and  future  *  Books,  references,  and  so  on  

   

Notes:      Thank  You                                      (ideas  used  from)  


Page 20: Software Attacks for Embedded, Mobile, and Internet of Things


*  “Software  Test  Attacks  to  Break  Mobile  and  Embedded  Devices”    –  Jon  Hagar  

 *  “How  to  Break  Software”  James  Whittaker,  2003  *  And  his  other  “How  To  Break…”  books  

* "A Practitioner's Guide to Software Test Design," Copeland, 2004
* "A Practitioner's Handbook for Real-Time Analysis," Klein et al., 1993
* "Computer Related Risks," Neumann, 1995
* "Safeware: System Safety and Computers," Leveson, 1995
* Honorable mentions:
* "Systems Testing with an Attitude," Petschenik, 2005
* "Software System Testing and Quality Assurance," Beizer, 1987
* "Testing Computer Software," Kaner et al., 1988
* "Systematic Software Testing," Craig & Jaskiel, 2001
* "Managing the Testing Process," Black, 2002

   

Book  Notes  List  (my  favorites)  


• www.stickyminds.com – collection of test info
• www.embedded.com – info on attacks
• www.sqaforums.com – Mobile Devices, Mobile Apps - Embedded Systems Testing forum
• Association for Software Testing – BBST classes: http://www.testingeducation.org/BBST/

•  Your  favorite  search  engine  

•  Our  web  sites  and  blogs  (listed  on  front  page)  

   

More  Resources  
