Digital Forensics with Open Source Tools

Cory Altheide
Harlan Carvey

Technical Editor: Ray Davidson

AMSTERDAM • BOSTON • HEIDELBERG • LONDON • NEW YORK • OXFORD • PARIS • SAN DIEGO • SAN FRANCISCO • SINGAPORE • SYDNEY • TOKYO

Syngress is an imprint of Elsevier
225 Wyman Street, Waltham, MA 02451, USA

Acquiring Editor: Angelina Ward
Development Editor: Heather Scherer
Project Manager: Andre Cuello
Designer: Joanne Blank

© 2011 Elsevier, Inc. All rights reserved.

No part of this publication may be reproduced or transmitted in any form or by any means, electronic or mechanical, including photocopying, recording, or any information storage and retrieval system, without permission in writing from the publisher. Details on how to seek permission, further information about the Publisher's permissions policies, and our arrangements with organizations such as the Copyright Clearance Center and the Copyright Licensing Agency can be found at our website: www.elsevier.com/permissions.

This book and the individual contributions contained in it are protected under copyright by the Publisher (other than as may be noted herein).

Notices
Knowledge and best practice in this field are constantly changing. As new research and experience broaden our understanding, changes in research methods or professional practices may become necessary. Practitioners and researchers must always rely on their own experience and knowledge in evaluating and using any information or methods described herein. In using such information or methods they should be mindful of their own safety and the safety of others, including parties for whom they have a professional responsibility.
To the fullest extent of the law, neither the Publisher nor the authors, contributors, or editors assume any liability for any injury and/or damage to persons or property as a matter of products liability, negligence or otherwise, or from any use or operation of any methods, products, instructions, or ideas contained in the material herein.

Library of Congress Cataloging-in-Publication Data
Application submitted

British Library Cataloguing-in-Publication Data
A catalogue record for this book is available from the British Library.

ISBN: 978-1-59749-586-8

Printed in the United States of America
11 12 13 14  10 9 8 7 6 5 4 3 2 1

Typeset by: diacriTech, India

For information on all Syngress publications visit our website at www.syngress.com

Contents

About the Authors
Acknowledgments
Introduction

CHAPTER 1  Digital Forensics with Open Source Tools
    Welcome to "Digital Forensics with Open Source Tools"
    What Is "Digital Forensics?"
        Goals of Forensic Analysis
        The Digital Forensics Process
    What Is "Open Source?"
        "Free" vs. "Open"
        Open Source Licenses
    Benefits of Open Source Tools
        Education
        Portability and Flexibility
        Price
        Ground Truth
    Summary
    References

CHAPTER 2  Open Source Examination Platform
    Preparing the Examination System
        Building Software
        Installing Interpreters
        Working with Image Files
        Working with File Systems
    Using Linux as the Host
        Extracting Software
        GNU Build System
        Version Control Systems
        Installing Interpreters
        Working with Images
    Using Windows as the Host
        Building Software
        Installing Interpreters
        Working with Images
        Working with File Systems
    Summary
    References

CHAPTER 3  Disk and File System Analysis
    Media Analysis Concepts
        File System Abstraction Model
    The Sleuth Kit
        Installing the Sleuth Kit
        Sleuth Kit Tools
    Partitioning and Disk Layouts
        Partition Identification and Recovery
        Redundant Array of Inexpensive Disks
    Special Containers
        Virtual Machine Disk Images
        Forensic Containers
    Hashing
    Carving
        Foremost
    Forensic Imaging
        Deleted Data
        File Slack
        dd
        dcfldd
        dc3dd
    Summary
    References

CHAPTER 4  Windows Systems and Artifacts
    Introduction
    Windows File Systems
        File Allocation Table
        New Technology File System
        File System Summary
    Registry
    Event Logs
    Prefetch Files
    Shortcut Files
    Windows Executables
    Summary
    References

CHAPTER 5  Linux Systems and Artifacts
    Introduction
    Linux File Systems
        File System Layer
        File Name Layer
        Metadata Layer
        Data Unit Layer
        Journal Tools
        Deleted Data
        Linux Logical Volume Manager
    Linux Boot Process and Services
        System V
        BSD
    Linux System Organization and Artifacts
        Partitioning
        Filesystem Hierarchy
        Ownership and Permissions
        File Attributes
        Hidden Files
        /tmp
    User Accounts
    Home Directories
        Shell History
        ssh
        GNOME Windows Manager Artifacts
    Logs
        User Activity Logs
        Syslog
        Command Line Log Processing
    Scheduling Tasks
    Summary
    References

CHAPTER 6  Mac OS X Systems and Artifacts
    Introduction
    OS X File System Artifacts
        HFS+ Structures
    OS X System Artifacts
        Property Lists
        Bundles
        System Startup and Services
        Kexts
        Network Configuration
        Hidden Directories
        Installed Applications
        Swap and Hibernation Data
        System Logs
    User Artifacts
        Home Directories
    Summary
    References

CHAPTER 7  Internet Artifacts
    Introduction
    Browser Artifacts
        Internet Explorer
        Firefox
        Chrome
        Safari
    Mail Artifacts
        Personal Storage Table
        mbox and maildir
    Summary
    References

CHAPTER 8  File Analysis
    File Analysis Concepts
        Content Identification
        Content Examination
        Metadata Extraction
    Images
        JPEG
        GIF
        PNG
        TIFF
    Audio
        WAV
        MPEG-3/MP3
        MPEG-4 Audio (AAC/M4A)
        ASF/WMA
    Video
        MPEG-1 and MPEG-2
        MPEG-4 Video (MP4)
        AVI
        ASF/WMV
        MOV (QuickTime)
        MKV
    Archives
        ZIP
        RAR
        7-zip
        TAR, GZIP, and BZIP2
    Documents
        OLE Compound Files (Office Documents)
        Office Open XML
        OpenDocument Format
        Rich Text Format
        PDF
    Summary
    References

CHAPTER 9  Automating Analysis and Extending Capabilities
    Introduction
    Graphical Investigation Environments
        PyFLAG
        Digital Forensics Framework
    Automating Artifact Extraction
        Fiwalk
    Timelines
        Relative Times
        Inferred Times
        Embedded Times
        Periodicity
        Frequency Patterns and Outliers (Least Frequency of Occurrence)
    Summary
    References

APPENDIX A  Free, Non-Open Tools of Note
    Introduction
    Chapter 3: Disk and File System Analysis
        FTK Imager
        ProDiscover Free
    Chapter 4: Windows Systems and Artifacts
        Windows File Analysis
        Event Log Explorer
        Log Parser
    Chapter 7: Internet Artifacts
        NirSoft Tools
        Woanware Tools
    Chapter 8: File Analysis
        Mitec.cz: Structured Storage Viewer
        OffVis
        FileInsight
    Chapter 9: Automating Analysis and Extending Capabilities
        Mandiant: Highlighter
        CaseNotes
    Validation and Testing Resources
        Digital Corpora
        Digital Forensics Tool Testing Images
        Electronic Discovery Reference Model
        Digital Forensics Research Workshop Challenges
        Additional Images
    References

Index

About the Authors

Cory Altheide is a security engineer at Google, focused on forensics and incident response. Prior to Google, Cory was a principal consultant with MANDIANT, an information security consulting firm that works with the Fortune 500, the defense industrial base, and banks of the world to secure their networks and combat cyber crime. In this role he responded to numerous incidents for a variety of clients in addition to developing and delivering training to corporate and law enforcement customers.

Cory also worked as the senior network forensics specialist in the National Nuclear Security Administration's Information Assurance Response Center (NNSA IARC). In this capacity he analyzed potentially hostile code, performed wireless assessments of Department of Energy facilities, and researched new forensic techniques. He also developed and presented hands-on forensics training for various DoE entities and worked closely with members of the Southern Nevada Cyber Crimes Task Force to develop their skills in examining less common digital media.

Cory has authored several papers for the computer forensics journal Digital Investigation and was a contributing author for UNIX and Linux Forensic Analysis (2008) and The Handbook of Digital Forensics and Investigation (2010).
Additionally, Cory is a recurring member of the program committee of the Digital Forensics Research Workshop.

Harlan Carvey (CISSP) is a vice president of Advanced Security Projects with Terremark Worldwide, Inc. Terremark is a leading global provider of IT infrastructure and "cloud computing" services based in Miami, Florida. Harlan is a key contributor to the Engagement Services practice, providing disk forensics analysis, consulting, and training services to both internal and external customers. Harlan has provided forensic analysis services for the hospitality industry and financial institutions, as well as federal government and law enforcement agencies. Harlan's primary areas of interest include research and development of novel analysis solutions, with a focus on Windows platforms.

Harlan holds a bachelor's degree in electrical engineering from the Virginia Military Institute and a master's degree in the same discipline from the Naval Postgraduate School. Harlan resides in Northern Virginia with his family.

Acknowledgments

Cory Altheide

First off I want to thank Harlan Carvey. In addition to serving as my coauthor and sounding board, he has been a good friend and colleague for many years. He has proven to be one of the most consistently knowledgeable and helpful individuals I have met in the field. Harlan, thanks again for adding your considerable expertise to the book and for never failing to buy me a beer every time I see you.

I also thank Ray Davidson for his work as technical editor. His early insights and commentary helped focus the book and made me target my subsequent writing on the intended audience.

Tremendous thanks go out to the "usual suspects" that make the open source forensics world the wonderful place it is.
First, thank you to Wietse Venema and Dan Farmer for creating open source forensics with "The Coroner's Toolkit." Thanks to Brian Carrier for picking up where they left off and carrying the torch to this day. Simson Garfinkel, you have my gratitude for providing the invaluable resource that is the Digital Forensics Corpora. Special thanks to Eoghan Casey, who first encouraged me to share my knowledge with the community many years ago.

To my parents, Steve and Jeanine Altheide, thank you for buying my first Commodore-64 (and the second… and the third). Thanks to my brother Jeremy Altheide and the Old Heathen Brewing Company for producing some of the finest beers around… someday.

I express infinite gratitude to my incredible wife Jamie Altheide for her never-ending patience, love, and support during the research and writing of this book. Finally, I thank my daughters Winter and Lily for reminding me every day that I will never have all the answers, and that's okay.

Harlan Carvey

I begin by thanking God for the many blessings He's given me in my life, the first of which has been my family. I try to thank Him daily, but I find myself thinking that that's not nearly enough. A man's achievements are often not his alone, and in my heart, being able to write books like this is a gift and a blessing in many ways.

I thank my true love and the light of my life, Terri, and my stepdaughter, Kylie. Both of these wonderful ladies have put up with my antics yet again (intently staring off into space, scribbling in the air, and, of course, my excellent imitations taken from some of the movies we've seen), and I thank you both as much for your patience as for being there for me when I turned away from the keyboard. It can't be easy to have a nerd like me in your life, but I do thank you both for the opportunity to "put pen to paper" and get all of this stuff out of my head. Yes, that was a John Byrne reference.
Finally, whenever you meet Cory, give him a thundering round of applause. This book was his idea, and he graciously asked me to assist. I, of course, jumped at the chance to work with him again. Thanks, Cory.

Introduction

INTENDED AUDIENCE

When writing a technical book, one of the first questions the authors must answer is "Who is your audience?" The authors must then keep this question in mind at all times when writing. While it is hoped that this book is useful to everyone that reads it, the intended audience is primarily two groups.

The first group is new forensic practitioners. This could range from students who are brand new to the world of digital forensics, to active practitioners that are still early in their careers, to seasoned system administrators looking to make a career change. While this book is not a singular, complete compendium of all the forensic knowledge you will need to be successful, it is, hopefully, enough to get you started.

The second audience is experienced digital forensics practitioners new to open source tools. This is a fairly large audience, as commercial, proprietary tools have had a nearly exhaustive hold on working forensic examiners. Many examiners operating today are reliant upon a single commercial vendor to supply the bulk of their examination capabilities. They rely on one vendor for their core forensic platform and may have a handful of other commercial tools used for specific tasks that their main tool does not perform (or does not perform well). These experienced examiners who have little or no experience with open source tools will also hopefully benefit greatly from the content of this book.

LAYOUT OF THE BOOK

Beyond the introductory chapter that follows, the rest of this book is divided up into eight chapters and one Appendix.

Chapter 2 discusses the Open Source Examination Platform.
We walk through all the prerequisites required to start compiling source code into executable code, install interpreters, and ensure we have a proper environment to build software on Ubuntu and Windows. We also install a Linux emulation environment on Windows along with some additional packages to bring Windows closer to "feature parity" with Linux for our purposes.

Chapter 3 details Disk and File System Analysis using the Sleuth Kit. The Sleuth Kit is the premier open source file system forensic analysis framework. We explain use of the Sleuth Kit and the fundamentals of media analysis, disk and partition structures, and file system concepts. We also review additional core digital forensics topics such as hashing and the creation of forensic images.

Chapter 4 begins our operating system-specific examination chapters with Windows Systems and Artifacts. We cover analysis of FAT and NTFS file systems, including internal structures of the NTFS Master File Table, extraction and analysis of Registry hives, event logs, and other Windows-specific artifacts. Finally, because malware-related intrusion cases are becoming more and more prevalent, we discuss some of the artifacts that can be retrieved from Windows executable files.

We continue on to Chapter 5, Linux Systems and Artifacts, where we discuss analysis of the most common Linux file systems (Ext2 and 3) and identification, extraction, and analysis of artifacts found on Linux servers and desktops. System level artifacts include items involved in the Linux boot process, service control scripts, and user account management. User-generated artifacts include Linux graphical user environment traces indicating recently opened files, mounted volumes, and more.

Chapter 6 is the final operating system-specific chapter, in which we examine Mac OS X Systems and Artifacts. We examine the HFS+ file system using the Sleuth Kit as well as an HFS-specific tool, HFSXplorer.
We also analyze the Property List files that make up the bulk of OS X configuration information and user artifacts.

Chapter 7 reviews Internet Artifacts. Internet Explorer, Mozilla Firefox, Apple Safari, and Google Chrome artifacts are processed and analyzed, along with Outlook, Maildir, and mbox formatted local mail.

Chapter 8 is all about File Analysis. This chapter covers the analysis of files that aren't necessarily bound to a single system or operating system—documents, graphics files, videos, and more. Analysis of these types of files can be a big part of any investigation, and as these files move frequently between systems, many have the chance to carry traces of their source system with them. In addition, many of these file formats contain embedded information that can persist beyond the destruction of the file system or any other malicious tampering this side of wiping.

Chapter 9 covers a range of topics under the themes of Automating Analysis and Extending Capabilities. We discuss the PyFLAG and DFF graphical investigation environments. We also review the fiwalk library designed to take the pain out of automated forensic data extraction. Additionally, we discuss the generation and analysis of timelines, along with some alternative ways to think about temporal analysis during an examination.

The Appendix discusses some non-open source tools that fill some niches not yet covered by open source tools. These tools are all available free of charge, but are not provided as open source software, and as such did not fit directly into the main content of the book. That said, the authors find these tools incredibly valuable and would be remiss in not including some discussion of them.

WHAT IS NOT COVERED

While it is our goal to provide a book suitable for novice-to-intermediate examiners, if you do not have any experience with Linux at the command line, you may find it difficult to follow along with the tool use examples.
While very few of the tools covered are Linux specific, most of the tool installation and subsequent usage examples are performed from a Linux console.

We focus exclusively on dead drive forensic analysis—media and images of systems that are offline. Collection and analysis of volatile data from running systems are not covered. Outside of the Linux platform, current tools for performing these tasks are largely closed source. That said, much of the analysis we go through is equally applicable to artifacts and items recovered from live systems.

Low-level detail of file system internals is intentionally omitted as this material is covered quite well in existing works. Likewise the development of open source tools is not discussed at length here. This is a book that first and foremost is concerned with the operational use of existing tools by forensic practitioners. Outside of the Appendix, no commercial, proprietary, closed source, or otherwise restricted software is used.

CHAPTER 1
Digital Forensics with Open Source Tools

INFORMATION IN THIS CHAPTER
• Welcome to "Digital Forensics with Open Source Tools"
• What Is "Digital Forensics?"
• What Is "Open Source?"
• Benefits of Open Source Tools

WELCOME TO "DIGITAL FORENSICS WITH OPEN SOURCE TOOLS"

In digital forensics, we rely upon our expertise as examiners to interpret data and information retrieved by our tools. To provide findings, we must be able to trust our tools. When we use closed source tools exclusively, we will always have a veil of abstraction between our minds and the truth that is impossible to eliminate.

We wrote this book to fill several needs. First, we wanted to provide a work that demonstrated the full capabilities of open source forensics tools. Many examiners who are aware of and who use open source tools are not aware that you can actually perform a complete investigation using solely open source tools.
Second, we wanted to shine a light on the persistence and availability (and subsequent examination) of a wide variety of digital artifacts. It is our sincere hope that the reader learns to understand the wealth of information that is available for use in a forensic examination.

To continue further, we must define what we mean by "Digital Forensics" and what we mean by "Open Source."

WHAT IS "DIGITAL FORENSICS?"

At the first Digital Forensics Research Workshop (DFRWS) in 2001, digital forensics was defined as:

The use of scientifically derived and proven methods toward the preservation, collection, validation, identification, analysis, interpretation, documentation and presentation of digital evidence derived from digital sources for the purpose of facilitating or furthering the reconstruction of events found to be criminal, or helping to anticipate unauthorized actions shown to be disruptive to planned operations [1].

While digital forensics techniques are used in more contexts than just criminal investigations, the principles and procedures are more or less the same no matter the investigation. While the investigation type may vary widely, the sources of evidence generally do not. Digital forensic examinations use computer-generated data as their source. Historically this has been limited to magnetic and optical storage media, but increasingly snapshots of memory from running systems are the subjects of examination.

Digital forensics is alternately (and simultaneously!) described as an art and a science. In Forensic Discovery, Wietse Venema and Dan Farmer make the argument that at times the examiner acts as a digital archaeologist and, at other times, a digital geologist.

Digital archaeology is about the direct effects from user activity, such as file contents, file access time stamps, information from deleted files, and network flow logs.
… Digital geology is about autonomous processes that users have no direct control over, such as the allocation and recycling of disk blocks, file ID numbers, memory pages or process ID numbers [2].

This mental model of digital forensics may be more apropos than the "digital ballistics" metaphor that has been used historically. No one ever faults an archaeologist for working on the original copy of a 4000-year-old pyramid, for example. Like archaeology and anthropology, digital forensics combines elements from "hard" or natural science with elements from "soft" or social science. Many have made the suggestion that the dichotomy of the art and science of forensic analysis is not a paradox at all, but simply an apparent inconsistency arising from the conflation of the two aspects of the practice: the science of forensics combined with the art of investigation. Applying scientific method and deductive reasoning to data is the science—interpreting these data to reconstruct an event is the art.

On his Web site, Brian Carrier makes the argument that referring to the practice as "digital forensics" may be partially to blame for some of this. While traditional crime scene forensic analysts are tasked with answering very discrete questions about subsets of evidence posed to them by detectives, digital forensic examiners often wear both hats. Carrier prefers the term "digital forensic investigation" to make this distinction clear [3].

Goals of Forensic Analysis

The goal of any given forensic examination is to find facts, and via these facts to recreate the truth of an event. The examiner reveals the truth of an event by discovering and exposing the remnants of the event that have been left on the system. In keeping with the digital archaeologist metaphor, these remnants are known as artifacts. These remnants are sometimes referred to as evidence.
As the authors deal frequently with lawyers in writing, we prefer to avoid overusing the term evidence due to the loaded legal connotations. Evidence is something to be used during a legal proceeding, and using this term loosely may get an examiner into trouble. Artifacts are traces left behind due to activities and events, which can be innocuous, or not.

As stated by Locard's exchange principle, "with contact between two items, there will be an exchange [4]." This simple statement is the fundamental principle at the core of evidence dynamics and indeed all of digital forensics. Specific to digital forensics, this means that an action taken by an actor on a computer system will leave traces of that activity on the system. Very simple actions may simply cause registers to change in the processor. More complex actions have a greater likelihood of creating longer-lasting impressions to the system, but even simple, discrete tasks can create artifacts. To use a real-world crime scene investigation analogy, kicking open a door and picking a lock will both leave artifacts of their actions (a splintered door frame and microscopic abrasions on the tumblers, respectively). Even the act of cleaning up artifacts can leave additional artifacts—the digital equivalent to the smell of bleach at a physical crime scene that has been "washed."

It is important to reiterate the job of the examiner: to determine truth. Every examination should begin with a hypothesis. Examples include "this computer was hacked into," "my spouse has been having an affair," or "this computer was used to steal the garbage file." The examiner's task is not to prove these assertions. The examiner's task is to uncover artifacts that indicate the hypothesis to be either valid or not valid. In the legal realm, these would be referred to as inculpatory and exculpatory evidence, respectively.
An additional hitch is introduced due to the ease with which items in the digital realm can be manipulated (or fabricated entirely). In many investigations, the examiner must determine whether or not the digital evidence is consistent with the processes and systems that were purported to have generated it. In some cases, determining the consistency of the digital evidence is the sole purpose of an examination.

The Digital Forensics Process

The process of digital forensics can be broken down into three categories of activity: acquisition, analysis, and presentation.

• Acquisition refers to the collection of digital media to be examined. Depending on the type of examination, these can be physical hard drives, optical media, storage cards from digital cameras, mobile phones, chips from embedded devices, or even single document files. In any case, media to be examined should be treated delicately. At a minimum the acquisition process should consist of creating a duplicate of the original media (the working copy) as well as maintaining good records of all actions taken with any original media.
• Analysis refers to the actual media examination—the "identification, analysis, and interpretation" items from the DFRWS 2001 definition. Identification consists of locating items present in the media in question and then further reducing this set to items or artifacts of interest. These items are then subjected to the appropriate analysis. This can be file system analysis, file content examination, log analysis, statistical analysis, or any number of other types of review. Finally, the examiner interprets results of this analysis based on the examiner's training, expertise, experimentation, and experience.
• Presentation refers to the process by which the examiner shares results of the analysis phase with the interested party or parties.
This consists of generating a report of actions taken by the examiner, artifacts uncovered, and the meaning of those artifacts. The presentation phase can also include the examiner defending these findings under challenge.

Note that findings from the analysis phase can drive additional acquisitions, each of which will generate additional analyses, etc. This feedback loop can continue for numerous cycles given an extensive network compromise or a long-running criminal investigation. This book deals almost exclusively with the analysis phase of the process, although basic acquisition of digital media is discussed.

WHAT IS "OPEN SOURCE?"

Generically, "open source" means just that: the source code is open and available for review. However, just because you can view the source code doesn't mean you have license to do anything else with it. The Open Source Initiative has created a formal definition that lays out the requirements for a software license to be truly open source. In a nutshell, to be considered open source, a piece of software must be freely redistributable, must provide access to the source code, must allow the end user to modify the source code at will, and must not restrict the end use of the software. For more detail, see the full definition at the Open Source Initiative's site [5].

"Free" vs. "Open"

Due to the overloading of the word "free" in the English language, confusion about what "free" software is can arise. Software available free of charge (gratis) is not necessarily free from restriction (libre). In the open source community, "free software" generally means software considered "open source" and without restriction, in addition to usually being available at no cost. This is in contrast to various "freeware" applications generally found on Windows systems available solely in binary, executable format but at no cost.
NOTE
Free for Some
Note that under the Open Source Initiative's definition, any license that restricts the use of software for certain tasks or that restricts distribution among certain groups cannot be an open source license. This includes the "Law Enforcement Only" or "Non-Commercial Use" restrictions commonly placed on freeware tools in the digital forensics community.

The core material of this book is focused on the use of open source software to perform digital forensic examinations. "Freeware" closed source applications that perform a function not met by any available open source tools or that are otherwise highly useful are discussed in the Appendix.

Open Source Licenses

At the time of this writing, there are 58 licenses recognized as "Open Source" by the Open Source Initiative [6]. Since this is a book about the use of open source software and not a book about the intricacies of software licensing, we briefly discuss the most commonly used open source licenses. The two most commonly used licenses are the GNU Public License (GPL) and the Berkeley Software Distribution License (BSD). To grossly simplify, the core difference between these two licenses is that the GPL requires that any modifications made to GPL code that is then incorporated into distributed compiled software be made available in source form as well. The BSD license does not have this requirement, instead only asking for acknowledgment that the distributed software contains code from a BSD-licensed project.

This means that a widget vendor using GPL-licensed code in their widget controller code must provide customers that purchase their widgets the source code upon request. If the widget was driven using BSD-licensed software, this would not be necessary. In other words, the GPL favors the rights of the original producer of the code, while the BSD license favors the rights of the user or consumer of the code.
Because of this requirement, the GPL is known as a copyleft license (a play on "copyright"). The BSD license is what is known as a permissive license. Most permissive licenses are considered GPL compatible because they give the end user authority over what he or she does with the code, including using it in derivative works that are GPL licensed. Additional popular GPL-compatible licenses include the Apache Public License (used by Apache Foundation projects) and the X11/MIT License.

BENEFITS OF OPEN SOURCE TOOLS

There are a great many passionate screeds about the benefits of open source software, the ethics of software licensing, and the evils of proprietary software. We will not repeat them here, but we will outline a few of the most compelling reasons to use open source tools that are specific to digital forensics.

Education

When the authors entered the digital forensics field, there were two routes to becoming an examiner. The first was via a law enforcement or military career, and the second was to teach yourself (with the authors representing each of these routes). In either scenario, one of the best ways to learn was by using freely available tools (and in the self-taught scenario, the only way!). Today, there are numerous college programs and training programs available to an aspiring examiner, but there is still something to be said for learning by doing. The authors have been using open source tools throughout their careers in digital forensics, and we both have no doubt that we are far better examiners than we would have been otherwise.

Using open source tools to learn digital forensics has several benefits. First, open source tools innately "show their work." You can execute the tool, examine the options and output, and finally examine the code that produced the output to understand the logic behind the tool's operation.
For the purposes of small examination scenarios, you can run the tools on any old hardware you have access to—no multithousand dollar deluxe forensic workstation required. Finally, you also have access to a dedicated community of examiners, developers, and enthusiasts ready to help you—provided you've done a modicum of legwork before firing off questions answered trivially by a Google search.

Portability and Flexibility

Another key benefit to the tools covered in this book by and large is that they are all portable and flexible. By portable we mean that you can easily take your toolkit with you as you move from one system to another, from one operating system to another, or from one job to another. Unless you personally license an expensive proprietary tool, your toolkit may not come with you if you move from one company to another. Any product-specific expertise you built up could end up worthless. If you are currently employed in law enforcement, any law enforcement–only tools you are currently using won't be available to you should you decide to go into the private sector.

If portability means you can choose where you use your tools, flexibility means you can choose how you use your tools. You can use open source tools on your local system or you can install them on a remote server and use them over a remote shell. You can install them on a single system or you can install them on thousands of systems. You can do all this without asking the software provider for permissions, without filling out a purchase order, and without plugging a thousand hardware copy protection dongles into a thousand machines.

Price

In addition to being open source, all of the tools covered in this work are free of cost. This is great for individuals looking to learn forensics on their own, students taking formal coursework in digital forensics, or examiners looking to build a digital forensics capability on a budget.
This is also a great benefit for anyone already using a full complement of commercial tools. Adding a set of open source tools to your toolkit will usually cost you nothing, save for a bit of time. Even if you continue using proprietary, commercial tools on a daily basis, you can use the tools in this book as an adjunct to cover gaps in your tools coverage or to validate or calibrate your tools' findings and operation.

Ground Truth

Arguably the biggest benefit open source software provides to the examiner is the code itself. As renowned philosopher Alfred Korzybski once said, "the map is not the territory." Being able to review the source code that you then compile into a working program is invaluable. If you have the skill and desire, you can make changes to the function of the code. You can verify fixes and changes between versions directly without having to simply trust what your software provider is telling you.

Using the Sleuth Kit as an example, we have no less than three different ways to review bug fixes in the software. First, we can review the change log files included with each release. Most proprietary software vendors will include something similar when a new version is released. Second, we can review the freely accessible bug trackers maintained at the Sleuth Kit project site [7]. Most proprietary forensic software vendors will not provide open access to their full list of bugs and fixes. Finally, we can take the previous version of the source code and compare it with the newer version automatically via the diff command, highlighting exactly which changes have occurred. The first option is reading the map. The last option is surveying the territory.

Additionally, with open source software the function of the code can be reviewed directly. The authors have had experiences where proprietary forensic products have produced demonstrably erroneous results, but these tests were performed in a "black box" scenario.
Known input files were generated and processed, and precalculated expected results were compared to the output from the proprietary tool, and false negatives were discovered. The authors had to bypass the tool's internal logic and implement correct function externally. Had the tool been open source, the error in processing could have been identified directly in the code and subsequently fixed in the main code repository, identifying and solving the problem for all users of the code.

In the previous scenario, the lack of access to the source code acted as an additional layer of abstraction between the examiners and the truth. Each layer of abstraction is a possible source for error or distortion. Since the goal of the examiner is to uncover truth, it is in the examiner's interest to ensure that the possible layers of abstraction are minimized. If your findings are ever brought into question, being able to show the actual source code used to generate data you interpreted could be incredibly valuable.

SUMMARY

The world of digital forensics has a long history of relying heavily on closed-source tools. Armed with an understanding of what we are doing, why we are doing it, and why we choose to use the tools we do, we can move on to the next chapter and begin building an open source examination platform.

References
[1] A Road Map for Digital Forensic Research, Digital Forensics Research Workshop, 2001, p. 16 (accessed 05.01.11).
[2] Dan Farmer & Wietse Venema, Forensic Discovery—Chapter 1: The spirit of forensic discovery. http://www.porcupine.org/forensics/forensic-discovery/chapter1.html (accessed 05.01.11).
[3] Digital Investigation and Digital Forensic Basics. http://www.digital-evidence.org/di_basics.html.
[4] W. Jerry Chisum & Brent E. Turvey, Profiling 1(1) (2000). http://www.profiling.org/journal/vol1_no1/jbp_ed_january2000_1-1.html (accessed 05.01.11).
[5] The Open Source Definition | Open Source Initiative. http://opensource.org/docs/osd.
[6] Open Source Licenses by Category | Open Source Initiative. http://opensource.org/licenses/category.
[7] Trackers—SleuthKitWiki. http://wiki.sleuthkit.org/index.php?title=Trackers (accessed 07.01.11).

CHAPTER 2
Open Source Examination Platform

INFORMATION IN THIS CHAPTER
• Preparing the Examination System
• Using Linux as the Host
• Using Windows as the Host

PREPARING THE EXAMINATION SYSTEM

Before using many of the open source forensics tools explored in the chapters to come, you will need to ensure that your system is prepared to build said tools. As they are "open source," the bulk of these tools are distributed primarily in source form; that is, you'll need to generate the executable code yourself. In other cases, the tools will be scripts that require a specific interpreter to run. This chapter deals with the setup required to perform examinations with open source tools using Linux and Windows hosts. For each platform we will go through the following steps.

Building Software

Because we are going to be working with open source software, we are going to need to be able to take that source code and convert it into usable form. At a high level this is known as building, and we will need to have one or more working build environments on the systems we are planning to use open source applications on. This chapter sets up a generic development environment that can be used to build open source applications written in the C and C++ languages and to build (or install) any libraries these applications require.

Installing Interpreters

Some of the open source applications we will be using are written in interpreted (or "scripting") languages, such as Perl, Python, or Ruby. To run these programs we will need the appropriate interpreter, and usually we will need some means to install prerequisite modules that the applications rely upon.
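A quick way to confirm that an interpreter is present and working is to run a one-liner through it. The following is a minimal sketch (shown with Python; note that on current distributions the binary is named python3 rather than python, and perl -e and ruby -e behave analogously):

```shell
# Sanity-check an installed interpreter with a one-liner (python3 shown;
# perl -e '...' and ruby -e '...' follow the same pattern).
python3 -c 'import sys; print("Python", sys.version_info[0], "is available")'
# → Python 3 is available
```

If the command is not found, the interpreter (or the expected binary name) is missing and will need to be installed through the package manager.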
Working with Image Files

A key part of forensic examination is working with image files—forensic copies of media. This is easier on some platforms, but is necessary on all—if we can't open the container, we can't get at the contents. An important part of setting up an examination system is ensuring that you can access and work with image files directly. While the forensic tools worked with in later chapters can access image files directly, having multiple means to access them is an effective means of verifying the operation of our tools. Having this capability also serves as a hedge in case our forensic tools fail to process a given image file properly.

There are two general classes of image files we will be working with: raw images and forensic containers. Raw image files are exactly what they sound like—a pure bit-for-bit copy of source media. Forensic containers are special file formats designed with forensic use in mind. Generally these will contain equivalent data found in a raw image and will also contain checksum information or metadata about the container. In addition, forensic container formats may allow for compression and encryption.

Working with File Systems

Finally, there will be times when we as examiners will need to interact with the file systems contained in image files using native system functionality (but still in a forensically sound manner). For example, to examine an Ext4 file system we'll need to use native system capabilities rather than a forensic utility simply because (at the time of writing) there are no open source forensic utilities that can interpret Ext4 file systems.

USING LINUX AS THE HOST

Using Linux as the base is the most common way to set up an open source forensics platform, and throughout the book will be our "supported" use case. In the examples we are using Ubuntu, but if you have a different distribution you prefer using already it's entirely possible to do so.
We will use the Ubuntu package manager apt to install software throughout the book; however, none of the software we are installing is Ubuntu-specific, and most of the packages we install should be available via the package manager in your distribution of choice.

Extracting Software

Linux source code is usually distributed in compressed archives known as tarballs. To extract these we will use the tar command along with a handful of flags. To extract tarballs with tgz or tar.gz extensions (GZipped tarballs), use the command:

tar xzf archive.tar.gz

To extract tarballs with tbz, tbz2, tar.bz2, or tar.bz extensions (BZipped tarballs), use this command:

tar xjf archive.tar.bz2

In either case you can add the -v option for verbose mode, which will display the name and path of extracted files to the console as the command progresses.

We'll need to install a few pieces of software before we can begin building programs from source code. On Ubuntu, the "build-essential" meta-package will get us started. Build-essential is basically a grouping of packages essential for building software. To install this package we'll use the apt-get command. The apt-get command is part of the "Advanced Packaging Tool," the main method for installing precompiled software packages on Debian-derived Linux distributions (including Ubuntu). Because installing packages is a system administration function, on Linux systems we will need to act with super-user or root privileges in order to do so. We will use the sudo command to run apt-get with super-user privileges.

TIP
Brief Linux Command Line Refresher
Ideally you will come into this book with some Linux experience. If it's been a while (or you're brand new), here's a refresher list of the basic terminal commands you'll need to use to navigate and work with files:
• cd changes directories. "cd .." goes up a directory, "cd /" goes to the top of the directory structure, and "cd ~" goes to your home directory.
• ls lists the contents of a directory (equivalent to "dir" in a Windows command prompt). "ls" will list the current directory, and "ls -l" will provide a verbose listing.
• pwd will print the current directory you are in, in case you get lost.
• mkdir will create a new directory.
• cp will copy a file. "cp -r" will copy a directory and all items in the subdirectory.
• mv will rename (or, move) a file or directory.
• rm will delete (or, remove) a file. "rm -r" is required to delete a directory (and all its subdirectories!).
• cat will dump the contents of a file to the screen. Long files can be viewed a page at a time using less or more.
• The pipe character "|" is used to chain the output from one command to the input of the next. The greater than sign ">" is used to send the output to a named file instead of the screen. Double arrows ">>" append the output instead of overwriting.
• Finally, man and info can be used to get usage information for any command.

user@ubuntu:~$ sudo apt-get install build-essential
[sudo] password for user:
The following extra packages will be installed:
  dpkg-dev fakeroot g++ g++-4.4 libstdc++6-4.4-dev patch xz-utils
Suggested packages:
  debian-keyring debian-maintainers g++-multilib g++-4.4-multilib
  gcc-4.4-doc libstdc++6-4.4-dbg libstdc++6-4.4-doc diffutils-doc
The following NEW packages will be installed:
  build-essential dpkg-dev fakeroot g++ g++-4.4 libstdc++6-4.4-dev
  patch xz-utils
0 upgraded, 8 newly installed, 0 to remove and 0 not upgraded.
Need to get 7571kB of archives.
After this operation, 24.6MB of additional disk space will be used.
Do you want to continue [Y/n]? Y

While we now have the basics of our build environment installed, we will come back to the apt-get command to install development libraries required by many of the applications we will be installing later.
Most open source applications will come with a README or INSTALL document that will contain information regarding what additional libraries are required. Be sure to reference this prior to attempting to build software. For more information on installing software on Ubuntu, please see the Ubuntu Help Guide [1].

GNU Build System

The majority of the software we will be building uses the "GNU Autotools" system to prepare and execute a build [2]. Building software that uses this system generally consists of running three commands, in sequence:
1. ./configure
2. make
3. (sudo) make install

Configure

If the application being built has any configurable options, the included configure script is the method we will use to set them. Generally the configure script will respond to the --help flag by displaying all the available configuration options. We can use the configure script from the LibEWF library as an example. We will discuss the operation of this library in detail shortly—for now it's enough to know that it is a prerequisite for the operation of our forensic platform. We have truncated the output that follows due to space considerations—most configure scripts will allow you to tune many options related to the building and subsequent installation of the software.

user@ubuntu:~/src/libewf-20100226$ ./configure --help
'configure' configures libewf 20100226 to adapt to many kinds of systems.
Usage: ./configure [OPTION]... [VAR=VALUE]...
...
Optional Features:
...
--enable-wide-character-type
                          enable wide character type support (default is no)
--enable-static-executables
                          build the ewftools as static executables (default is no)
--enable-low-level-functions
                          use libewf's low level read and write functions in the ewftools (default is no)
--enable-low-memory-usage enable low memory usage (default is no)
--enable-verbose-output   enable verbose output (default is no)
--enable-debug-output     enable debug output (default is no)
--enable-python           build python bindings (pyewf) (default is no)
--enable-v2-api           enable experimental version 2 API (default is no)
...

Here we can see a handful of libewf-specific configuration options that may be of interest to us. Referencing the included README file tells us that libewf relies on zlib (the deflate/zip compression library) and libcrypto (the OpenSSL library). Different distributions will have these libraries packaged under different names, but in general these can be located fairly easily by searching for the development name of the libraries at hand using the apt-cache search command or equivalent command for your distribution.
user@ubuntu:~$ apt-cache search openssl | grep dev
libssl-ocaml-dev - OCaml bindings for OpenSSL
libcurl4-openssl-dev - Development files and documentation for libcurl (OpenSSL)
libssl-dev - SSL development libraries, header files and documentation
user@ubuntu:~$ apt-cache search zlib | grep dev
lib32z1-dev - compression library - 32 bit development
libmng-dev - M-N-G library (Development headers)
libzlcore-dev - zlibrary core - development files
libzltext-dev - zlibrary text model/viewer - development files
zlib1g-dbg - compression library - development
zlib1g-dev - compression library - development
libcryptokit-ocaml-dev - cryptographic algorithm library for OCaml - development
libniftiio1-dev - IO libraries for the NIfTI-1 data format
libtrf-tcl-dev - Tcl data transformations - development files
libzzip-dev - library providing read access on ZIP-archives - development

From the results, we see that we need the "zlib1g-dev" and "libssl-dev" libraries, which can be installed using the following command:

user@ubuntu:~$ sudo apt-get install zlib1g-dev libssl-dev
...
Setting up libssl-dev (0.9.8k-7ubuntu8) ...

With our libraries installed, we are ready to execute the configure script. Upon execution, the configure script will check the build system to ensure that all the libraries required to build (and subsequently execute) the program are present, functional, and of the correct version.

user@ubuntu:~/src/libewf-20100226$ ./configure --enable-wide-character-type --enable-low-level-functions
checking for a BSD-compatible install... /usr/bin/install -c
checking whether build environment is sane... yes
...
config.status: executing libtool commands
configure:
Building:
   libuna support: local
   libbfio support: local
   libcrypto EVP support: yes
   libcrypto MD5 support: evp
   libcrypto SHA1 support: evp
   guid support: libuuid
Features:
   Wide character type support: yes
   ewftools are build as static executables: no
   ewftools use low level read and write functions: yes
   Python (pyewf) support: no
   Verbose output: no
   Debug output: no
   Experimental version 2 API: no

Note that this particular configuration script provides the name of the library providing the included functionality: "guid support: libuuid." If the README or INSTALL documentation is missing, incomplete, or simply incorrect, attempting to run the configure script is a trial-and-error method that may provide more information about which libraries you need to complete the build.

WARNING
"./configure not found"
Very simple programs that don't have any third-party library dependencies may not have a configure script. Some of these may provide a prebuilt makefile, but if this isn't the case you may need to compile the code directly. Check for a README or INSTALL document or, barring that, read the source itself for an indication of how to proceed with building the software.

A successful execution of the configure script will generate a "makefile"—a build script read by the make command, which brings us to the next step in the build process.

Make

The make command reads through the "makefile" generated by the configure script and proceeds to compile and link each of the executable files and libraries that make up the program at hand. This can take some time and will generate a lot of output while it is executing. A highly edited sample of what you can expect from a typical make execution appears here:

user@ubuntu:~/src/libewf-20100226$ make
Making all in include
make[1]: Entering directory '/home/user/src/libewf-20100226/include'
...
make[1]: Entering directory '/home/user/src/libewf-20100226/liberror'
/bin/bash ../libtool --tag=CC --mode=compile gcc -DHAVE_CONFIG_H -I. -I../common -I../include -I../common -g -O2 -Wall -MT liberror_error.lo -MD -MP -MF .deps/liberror_error.Tpo -c -o liberror_error.lo liberror_error.c
libtool: compile: gcc -DHAVE_CONFIG_H -I. -I../common -I../include -I../common -g -O2 -Wall -MT liberror_error.lo -MD -MP -MF .deps/liberror_error.Tpo -c liberror_error.c -fPIC -DPIC -o .libs/liberror_error.o
...
make[1]: Leaving directory '/home/user/src/libewf-20100226'

Once the "make" completes, the final step is to actually install the application. Executing sudo make install will copy the finished executables, libraries, documentation (if present), and any additional materials to their configured locations—generally under the "/usr/local/" directory.

NOTE
Other Build Systems
GNU Autotools does not stand alone in the pantheon of build systems for open source software. While it is the most venerable and still the most common build system in use, others appear in various projects. Two of the most common are cmake and scons. Scons is Python based and is therefore a popular build system for Python-heavy programs. Cmake is an intermediary build layer used by cross-platform applications—it generates native build files for each target: Makefiles for Unix-like systems, Visual Studio solution files for Windows targets, and so on.

Version Control Systems

In addition to packaged tarballs of source code, many open source projects are available via version control systems. These services enable tracking of code changes among a distributed group of participants. Version control systems offer many capabilities geared toward ensuring clean and easy collaboration on development; however, for our use, we will only be "checking out" code—retrieving a copy of the source code from the repository.
The end result will be a directory tree of code similar to what we would have after extracting a tarball. Sometimes, code that is still under active development may only be available via a source checkout from a version control system; in other cases, the development version of an application may have capabilities required to perform a successful examination. As always, validate that any tools you use perform the functions they are designed for in a verifiable, repeatable manner, but above all work on copies or extracted data rather than original data.

Popular open source version control systems include
• cvs
• subversion
• git
• mercurial

We will discuss the operation of these tools to perform source code checkouts when we build tools that require this.

Installing Interpreters

In addition to compiling executable code, we will need to be able to execute programs written in interpreted languages. To do so, we will need to install the appropriate interpreters—Perl, Python, and Ruby. On most Linux distributions the Perl and Python interpreters (and a handful of modules) will already be installed. We'll want to install the Python 3 interpreter in addition to our currently installed version, and we'll need to install the Ruby interpreter. We will also explore how to install various modules in each of the interpreters.

Perl

Perl is one of the older nonshell scripting languages still in common use. Its longevity is one of its core strengths—over many years of use Perl has built up an impressive number of open source libraries and modules available for reuse.

TIP
CERT Forensic Tools Repository
The examples throughout this book use Ubuntu as our base operating system. If you are a Fedora user, you may want to use the CERT Linux Forensics Tools Repository. This repository provides a prebuilt set of forensic packages for Fedora 10, 11, 12, and 13, including many of the tools discussed throughout this book.
http://www.cert.org/forensics/tools/

To check our installed version of perl we can issue the following command:

user@ubuntu:~$ perl -v
This is perl, v5.10.1 built for x86_64-linux-gnu-thread-multi

The core repository for Perl modules is known as CPAN (the Comprehensive Perl Archive Network). Packages in CPAN can be installed from the terminal using the -MCPAN option to perl. Upon executing this command for the first time, your CPAN preferences will be set up—the defaults are fine for our usage, so hit Enter when prompted to accept them.

user@ubuntu:~$ perl -MCPAN -e shell
...
cpan[1]> help
Display Information (ver 1.9402)
 command    argument    description
...
cpan[n]> quit

Python

Like Perl, Python will be present on most Linux distributions by default. We can check the Python version we have installed with the -V flag:

user@ubuntu:~$ python -V
Python 2.6.5

In addition to the Python 2.6 interpreter, we want a parallel installation of the Python 3 interpreter. Python 3 represents a major change from the 2.x series and as such is not directly backward compatible with existing programs written for Python 2.x. Because we will be using a few programs targeted for the newer Python, we will need both. We can install Python 3 directly using the following command:

user@ubuntu:~$ sudo apt-get install python3-minimal
[sudo] password for user:
Reading package lists... Done
Building dependency tree
Reading state information... Done
The following extra packages will be installed:
  python3.1 python3.1-minimal
Suggested packages:
  python3.1-doc python3.1-profiler
The following NEW packages will be installed:
  python3-minimal python3.1 python3.1-minimal
0 upgraded, 3 newly installed, 0 to remove and 0 not upgraded.
Need to get 4,995kB of archives.
After this operation, 17.7MB of additional disk space will be used.
Do you want to continue [Y/n]?

Unlike Perl and Ruby, Python doesn't have a "standard" package management system.
Python modules are instead expected to be handled by the operating system's package manager or installed manually by the user. As we use programs that need specific packages, we will install them using both methods. That said, Python does have a centralized package repository, and there are several unofficial package managers available that leverage this repository. The most widely used is easy_install, provided by the "python-setuptools" package. We can install this for Python 2.x and 3.x using the following command:

user@ubuntu:~$ sudo apt-get install python-setuptools python3-setuptools

Ruby

Ruby is the third scripting language we will need to ensure is installed. As a younger language, it is not present by default on our Ubuntu installation:

user@ubuntu:~$ ruby -v
The program 'ruby' is currently not installed.  You can install it by typing:
sudo apt-get install ruby

As just shown, we can install it via apt-get. Once this is completed, we can verify the install and check the version with the -v option.

user@ubuntu:~$ ruby -v
ruby 1.8.7 (2010-01-10 patchlevel 249) [i486-linux]

Ruby packages are managed via RubyGems, which needs to be installed separately:

user@ubuntu:~$ sudo apt-get install rubygems

The package manager is invoked via the gem command:

user@ubuntu:~$ gem
RubyGems is a sophisticated package manager for Ruby. This is a
basic help message containing pointers to more information.

  Usage:
    gem -h/--help
    gem -v/--version
    gem command [arguments...] [options...]

  Examples:
    gem install rake
    gem list --local
    gem build package.gemspec
    gem help install
...

We will make further use of tools that require each of these interpreters throughout the book. For now, ensuring that they are installed and operational, and that we can install packages for them, is sufficient.
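Before moving on, a quick sanity check is useful. The following loop is our own convenience sketch rather than something from the book; it simply reports whether each interpreter is on the PATH:

```shell
#!/bin/sh
# Report whether each interpreter used later in the book is available.
# "command -v" is the portable way to test for a program on the PATH.
for interp in perl python ruby; do
    if command -v "$interp" >/dev/null 2>&1; then
        printf '%s: installed at %s\n' "$interp" "$(command -v "$interp")"
    else
        printf '%s: NOT FOUND\n' "$interp"
    fi
done
```

On a stock Ubuntu installation of this vintage you would expect perl and python to be found, with ruby reported missing until it is installed via apt-get as shown earlier.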
Working with Images

Although we will be using forensic utilities that can interpret the file system on a raw image, it is in our interest to ensure that we can work with image files using native system functionality as well. This enables us to test our forensic tools for accuracy, provides us a much needed "safety net" in the event our tools don't function properly, and, in some cases, may be the most useful way to access data of interest. Any given Linux distribution should have the capability to work with raw image files natively. We will use the losetup command to create a "loop device" associated with our disk image. A loop device is a virtual device that allows a disk image to be treated as if it were an actual disk.

user@ubuntu:~$ losetup
Usage:
 losetup loop_device                              # give info
 losetup -a | --all                               # list all used
 losetup -d | --detach loop_device                # delete
 losetup -f | --find                              # find unused
 losetup [ options ] {-f|--find|loop_device} file # setup

Options:
 -e | --encryption <type>  enable data encryption with specified <name/num>
 -h | --help               this help
 -o | --offset <num>       start at offset <num> into file
 -p | --pass-fd <num>      read passphrase from file descriptor <num>
 -r | --read-only          setup read-only loop device
 -s | --show               print device name (with -f <file>)
 -N | --nohashpass         Do not hash the given password (Debian hashes)
 -k | --keybits <num>      specify number of bits in the hashed key given

WARNING
Automounting of Volumes on Ubuntu
One of the major benefits of using Linux as a forensic platform is the tremendous power the system provides to an educated examiner. However, as Linux distributions become more and more "user-friendly," the operation of the system becomes more and more abstracted away and difficult to access. One example is preventing the automatic mounting of external media. Historically,