Tutorial 1

Macintosh Forensics: What You Need to Know
Dr. Avinash Srinivasan, Bloomsburg University of Pennsylvania, USA

Digital forensic investigation refers to the process of gathering, preserving, analyzing, and presenting digital evidence in a manner acceptable in a court of law. The domain of digital forensics is intricate and multi-faceted. It is broad in nature and encompasses a wide range of subfields, including data mining, artificial intelligence, high-performance computing, and distributed and parallel computing. Though digital forensics is still in its infancy, it has already carved out its niche in cutting-edge research and drawn significant attention from academia, industry, and, in particular, law enforcement.

A meticulous and forensically sound investigation requires the examiner to have a thorough understanding of the underlying file system. Mac OS X, combined with Apple's proprietary HFS+ file system, lends itself to cutting-edge research due to the complex nature of the file system. In 2006, Apple began using EFI on its Intel-based systems, replacing the Open Firmware standard of its earlier PowerPC-based systems. This brought about several changes, including the creation of an EFI system partition.

The objective of this tutorial is to ensure that participants leave the room with a better understanding of Mac OS X, the HFS+ file system, and the forensically important artifacts left behind on a Mac computer.

The following are the main topics covered in the tutorial:

  • Overview of Mac OS X
  • Open Firmware and Extensible Firmware Interface
  • HFS+ file system
  • Forensic artifacts
    • System generated artifacts
    • User generated artifacts
    • Application generated artifacts
  • Areas of interest for the examiner
    • PLIST files (see the sketch after this list)
    • Trash (.Trash and .Trashes)
    • Quick Look and Spotlight
    • Browser and Chat Logs
    • Email file
    • iTunes
    • Virtualization: Boot Camp and Parallels
  • Forensic examiners’ goldmine
    • Unallocated Space
    • Sleep image and Swap file
    • Slack Space
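
One of the areas of interest listed above is PLIST (property list) files. As a small illustration, and not part of the tutorial material itself, the Python sketch below parses a property list with the standard-library plistlib module and prints its top-level keys; the file path is a hypothetical example, and a real examination would work from a forensic image rather than a live system.

    import plistlib

    # Hypothetical path; an examiner would normally extract this file from a forensic image.
    PLIST_PATH = "/Users/suspect/Library/Preferences/com.apple.recentitems.plist"

    with open(PLIST_PATH, "rb") as fp:
        data = plistlib.load(fp)  # handles both XML and binary property lists

    # List the top-level keys and value types to see what the file records.
    for key, value in data.items():
        print(key, type(value).__name__)
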
Dr. Avinash Srinivasan earned his B.E. (Industrial Production, 1999) from the University of Mysore with Honors and his M.S. (Computer Science, 2003) from Pace University, New York, with the Outstanding Achievement Award for Academic Excellence. He received his Ph.D. in Computer Science from Florida Atlantic University in August 2008 under the distinguished guidance of Prof. Jie Wu. During his doctoral studies, he was awarded two highly competitive scholarships: the Graduate Fellowship for Academic Excellence and the Dr. Daniel B. and Aural B. Newell Doctoral Fellowship. He joined Bloomsburg University of Pennsylvania as an Assistant Professor of Computer Forensics in August 2008. Dr. Srinivasan has been actively engaged in research on network security and forensics, forensic analysis of file systems, Macintosh forensics, reputation- and trust-based security models for wireless and sensor networks, and forensic analysis of the Xbox FATX file system.

Dr. Srinivasan has published more than 20 papers in the area of security in wireless sensor networks at top conferences, including INFOCOM and ACM SAC. He has also served on the technical program committees of over fifty international conferences and workshops. Dr. Srinivasan is currently serving as an Associate Editor for Wiley's International Journal of Security and Communication Networks and for Hex, an online student journal. He is also serving as a guest associate editor for IEICE Transactions on Information and Systems. Dr. Srinivasan has served as a review panelist for the US Department of Energy Office of Science Graduate Fellowship Program (Computer Science Panel). He has served as publicity chair/co-chair for over seven international conferences and workshops. Dr. Srinivasan has reviewed numerous book chapters and journal manuscripts for venues including ACM/Springer WINET, Elsevier COMNET, ACM/Springer PPNA, ACM/Springer MONET, IEEE TWC, IEEE TPDS, IEEE TMC, IJPCC, IJSN, JPDC, IJIPT, and JCST. Dr. Srinivasan has over 350 hours of digital forensics training, including network forensics, Windows Registry forensics, Macintosh forensics, and FAT and NTFS file system forensics. He has been featured in Marquis "Who's Who in America 2010", Marquis "Who's Who in Science and Engineering 2011", and Marquis "Who's Who in The World 2011".

Tutorial 2

Network Forensics: Introduction to Retrospective Network Analysis
Bhadran V K, Director, Resource Centre for Cyber Forensics
Centre for Development of Advanced Computing, Trivandrum, India

Network forensics is the capture, recording, and analysis of network events in order to discover the source of security attacks or other problem incidents occurring in a network. To analyze a cyber crime, network forensics experts need expertise in various fields, such as malware analysis, memory analysis, and IP traceback techniques, and must be well aware of the different kinds of attacks on networks. Retrospective Network Analysis (RNA) allows one to go “back in time” to reconstruct a failure or attack, identify what happened, and trace the crime back to its originator. Traditional real-time packet capture and analysis helps in analyzing protocols and traffic, whereas RNA monitors the network for all events around the clock, much like a surveillance camera.

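To make the idea of retrospective analysis concrete, the short Python sketch below, an illustrative example rather than material from the tutorial, reads a previously recorded capture file with the third-party Scapy library and summarizes which hosts talked to which; the file name incident_capture.pcap is hypothetical.

    from collections import Counter
    from scapy.all import rdpcap, IP  # requires Scapy (pip install scapy)

    # Hypothetical capture file recorded earlier by an always-on collector.
    packets = rdpcap("incident_capture.pcap")

    # Count source/destination IP pairs to see which hosts communicated.
    conversations = Counter()
    for pkt in packets:
        if IP in pkt:
            conversations[(pkt[IP].src, pkt[IP].dst)] += 1

    for (src, dst), count in conversations.most_common(10):
        print(src, "->", dst, ":", count, "packets")
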
This tutorial gives the participants:

  • An Introduction to Network Forensics
    • Procedure
    • Live Forensics
    • Memory Analysis
    • Traceback
  • Understanding Network Protocols
  • Packet Capture Techniques
  • Packet Analysis Techniques
  • Discussion on various RNA tools
Bhadran V K is Director of the Resource Centre for Cyber Forensics at the Centre for Development of Advanced Computing, an autonomous institution of the Indian government undertaking application-oriented research in electronics and ICT. The Centre has developed a variety of cyber forensic tools for disk forensics, network forensics, and device forensics. Bhadran was pivotal in establishing the Resource Centre for Cyber Forensics and currently leads its development activities in network forensics. He is working on an enterprise forensics system, "TEAMS - Transparent Enterprise Activity Monitoring Solution", which takes a layered approach to policy-based monitoring and mitigation of malicious activities originating inside the organization as well as external threats. He has more than 25 years of experience in the field of ICT and a strong background in artificial intelligence areas such as expert systems, intelligent tutoring systems, natural language processing, machine translation, and robotics. He teaches network forensics in various training programs for law enforcement, defense, and corporate organizations. He has spoken at seminars across India on network forensics and network security and is a regular guest speaker on cyber crime, network forensics, and network security at various engineering colleges, including the Military College of Telecommunication Engineering, MHOW, India. Bhadran has also participated in international network forensics workshops, providing training in Southeast Asian countries and Mauritius. He has received training in information security, incident handling, and cyber forensics at the CERT Coordination Center, Carnegie Mellon University, USA, and in hacker techniques and exploits from the SANS Institute. Bhadran is also the recipient of the 2002 Dr. Vasudev Award for physical sciences, constituted by the State Committee on Science, Technology and Environment, Government of Kerala. Recently, he completed advanced-level training on information security at the Software Engineering Institute, CERT, USA, and presented a tutorial on network forensics at the International Conference on Digital Forensics and Cyber Crime in Abu Dhabi.



Tutorial 3

A Tutorial on Next-Generation Cloud Computing
Dr. Pethuru Raj, Lead Architect, Corporate Research (CR)
Robert Bosch Engineering and Business Solutions (RBEI), Bangalore, India


The much-hyped and much-hoped-for cloud paradigm is seeing unprecedented adoption and adaptation across the globe. The path-breaking cloud paradigm and philosophy is actually a smart and sensible combination of several proven and promising technologies, such as consolidation, virtualization, optimization, automation, and service orientation (SO), and an array of computing paradigms, such as cluster, grid, on-demand, autonomic, and utility computing. There is a growing array of competent automation and management tools for empowering, monitoring, managing, and maintaining cloud infrastructures and resources. Vital tasks such as auto-provisioning of cloud resources, job scheduling, workload management, and virtual machine (VM) creation and control are being compactly automated, resulting in a tremendous reduction in IT complexity. In a typical cloud environment, real-time elasticity of resources and scalability of services and applications are also guaranteed. All of this portends that the pioneering cloud idea is here to stay and shine. The noteworthy aspect is that cloud infrastructures fulfil myriad quality attributes (non-functional requirements such as IT scalability/elasticity, availability, affordability, adaptability, alacrity, sustainability, consumability, and high performance). The cloud paradigm has come as a boon and blessing for enterprise IT, as it introduces and incorporates a series of innovations and improvements in order to realize the dream of business-IT alignment.

The Emerging Trends in the Cloud Space - The extraordinary success of cloud computing in the enterprise space is being spectacularly leveraged and replicated in the vast and varied embedded space. All kinds of physical and embedded devices are being connected and linked up with clouds, inducing a deeper and defter connectivity among disparate, distributed, and decentralised devices and enabling elegant, situation-aware, and people-centric services and applications. Cloud, being such a disruptive and transformative technology, is bound to raise a storm of advancements and accomplishments across a variety of domains in the days to unfold. In other words, the cloud space is all set to join the enterprise and web spaces as the third major force in accurately understanding people's needs and in conceiving, conceptualizing, and concretizing the identified requirements in the form of services and applications that can be delivered unobtrusively to the right people at the right time and place. Cloud is bringing the much-needed transition from expensive IT to elastic, elegant, and finally exotic IT. The cloud space is seeing much more value and verve as entrepreneurs, employees, and executives become aware of the disruptive nature of cloud technology.

Next-Generation Cloud-based Services - All kinds of enterprise services and applications are being modernized, migrated, and managed in consolidated, converged, dynamic, elastic, and adaptive cloud infrastructures and platforms. This induces and inspires the need for competent, dynamic, and versatile cloud brokers (a kind of middleware for connecting, integrating, and composing people-centric and context-aware cloud services) and cloud brokerage services firms. As per Gartner's latest market research and analysis report on cloud computing, there is a huge market out there for cloud brokerage services. Novel services and applications are being built by individuals, innovators, and institutions with the solitary goal of supplying them to the world from clouds. As the visibility, agility, availability, and acceptability of cloud services, platforms, and infrastructures become prominent and dominant, a new group of companies and corporations is emerging and establishing itself to act as connectors, brokers, mediators, arbitrators, and decision-makers for a variety of cloud resources.

In this tutorial talk, I would like to focus on the following topics:
  • The Enabling Technologies of Cloud Computing
  • The Principal Drivers of Cloud Computing
  • Emerging Deployment, Delivery and Consumption Models
  • Newer and Nimbler Cloud Types
  • The Security, Visibility, Controllability, Interoperability and the Portability Issues
  • The Emergence and Significance of Cloud Brokerage Services
  • The Cloud Integration & Composition Scenarios
  • The Cloud Middleware
  • The Intermediation Services
  • The Arbitration and Aggregation Services
Dr. Pethuru Raj has been working as a Lead Architect at Robert Bosch Engineering and Business Solutions (RBEI), Bangalore, since 2009. Before that, he spent three years as a senior consultant at Wipro Technologies, Bangalore, focusing on emerging and evolving technologies such as SOA and cloud computing. He was selected as a JSPS Research Fellow and later as a JST Research Scientist to focus on promising IT technologies at two leading Japanese universities for three years. In total, Pethuru Raj has eight years of solid industry experience in business integration technologies and tools. He also has substantial experience and expertise in enterprise-scale and distributed systems development and maintenance standards and platforms, such as enterprise Java (JEE) and the Microsoft .NET framework. His current research areas include embedded SOA for service-based device-to-device (D2D) integration, cloud technology for device-to-cloud (D2C) and sensor-to-cloud (S2C) integration, the Internet of Things (IoT) for realizing and sustaining a bevy of smarter environments, and IoT-enabling technologies such as RFID and smart sensor networks.

He earned his PhD from Anna University, Chennai, and worked as a UGC-sponsored research associate (RA) in the Department of Computer Science and Automation (CSA), Indian Institute of Science, Bangalore. He has contributed well-recognized book chapters on BPM, SOA, cloud computing, ubiquitous computing, ambient intelligence (AmI), etc., to four books edited by internationally acclaimed professors. He is also writing, on his own, a comprehensive book on the Internet of Things and its impact on human society. He has a personal site at www.peterindia.net.

Call for Tutorials

We invite the submission of proposals for tutorials on various areas of computing and communications at ACC2011. All proposals should be submitted by electronic mail as a .doc attachment. Tutorial proposals should be submitted to acc2011.rset@gmail.com no later than February 04, 2011; acceptance will be notified by February 15, 2011.


Each tutorial proposal must include:

• Title of the tutorial

• Name, title, affiliation, address, and a short CV (up to 200 words) of the instructor(s)

• A short summary (up to 1 page) of the tutorial proposal, emphasizing its timeliness

• Preferred length of tutorial (half-day or full-day, though half-day tutorials are preferred)

• If appropriate, a description of past tutorials, including number of attendees, etc.


©RSOFT, Rajagiri School of Engineering & Technology
Rajagiri Valley, Kakkanad, Kochi, Kerala, India.