Archive: 2018-06

When the rate of change inside an institution becomes slower than the rate of change outside, the end is in sight. - Jack Welch

In a world of agile everything, agile concepts are being applied in areas well beyond software development. At the NDIA Agile in Government Summit held in Washington, D.C. in June, Dr. George Duchak, the Deputy Assistant Secretary of Defense for Cyber, Command & Control, Communications & Networks, and Business Systems, spoke about the importance of agility to organizational success in a volatile, uncertain, complex, and ambiguous world. Dr. Duchak told the crowd that agile software development can't stand alone, but must be woven into the fabric of an organization and become a part of the way an organization's people, processes, systems, and data interact to deliver value. The business architecture must be constructed for agility.

I first wrote about agile strategic planning in my March 2012 blog post, Toward Agile Strategic Planning. In this post, I want to expand that discussion to look more closely at agile strategy, or short-cycle strategy development and execution, describe what it looks like when implemented, and examine how it supports organizational performance.

Part one of this series of blog posts on the collection and analysis of malware and storage of malware-related data in enterprise systems reviewed practices for collecting malware, storing it, and storing data about it. This second post in the series discusses practices for preparing malware data for analysis and examines issues related to messaging between big data framework components.

Citing the need to provide a technical advantage to the warfighter, the Department of Defense (DoD) has recently made the adoption of cloud computing technologies a priority. Infrastructure as code (IaC), the process and technology of managing and provisioning computers and networks (physical and/or virtual) through scripts, is a key enabler for efficient migration of legacy systems to the cloud. This blog post details research aimed at developing technology that helps software sustainment organizations automatically recover the deployment baseline and create the needed IaC artifacts with minimal manual intervention and no specialized knowledge about the design of the deployed system. The project will develop technology to automatically recover a deployment model from a running system and create IaC artifacts from that model.
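To make the IaC idea concrete, here is a minimal sketch of provisioning expressed as a script rather than as manual steps. The package name, paths, and config values are illustrative assumptions, not artifacts from the research described above; real IaC tooling (e.g., configuration-management or orchestration frameworks) adds declarative models and state tracking on top of this basic idea.

```shell
#!/bin/sh
# Minimal IaC-style sketch: the environment is described and created by a
# script, so the same state can be reproduced by re-running it.
# PKG, the directory, and the config contents are hypothetical examples.
set -e

PKG=nginx
if ! command -v "$PKG" >/dev/null 2>&1; then
    # In a real script this would invoke the platform's package manager,
    # e.g. apt-get install -y "$PKG"; here we only report the action.
    echo "would install $PKG"
fi

# Idempotent setup: safe to run repeatedly with the same end state.
mkdir -p /tmp/demo_app
echo "server_port: 8080" > /tmp/demo_app/config.yml
```

Because every step is idempotent, the script doubles as documentation of the deployment baseline: re-running it converges the machine to the described state instead of drifting from it.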

The growth of big data has affected many fields, including malware analysis. Increased computational power and storage capacities have made it possible for big-data processing systems to handle the increased volume of data being collected. Beyond collection itself, researchers have also developed new ways of analyzing and visualizing malware. In this blog post--the first in a series on using a big-data framework for malware collection and analysis--I will review various options and tradeoffs for dealing with malware collection and storage at scale.