



In today's evolving world, the role of information in defense is greater than ever before. How fast and reliably actionable information reaches decision makers determines the success of any defense action. This ideal is hampered by the large volumes of data spread across different data stores and platforms. Collecting relevant data from each platform and converting it into actionable information for defense leaders is therefore slow, and information often reaches the decision maker only once it is obsolete.

Several technological solutions to this problem exist in the market. Key among them are cloud computing and new virtualization technologies, which allow data to be accessed from multiple data stores without actually moving it. This speeds up data transfer, ensuring that relevant information reaches where it is needed in time for decision making. However, most defense agencies lag behind in the uptake of these technologies. One key reason is the procurement rules imposed on government agencies. The general risk averseness of defense agencies also hinders adoption, since a new technology must undergo a multitude of tests and trials before the military will consider it.
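The "query in place" idea behind data virtualization can be sketched with a toy federated layer. Everything below is invented for illustration (in-memory SQLite databases stand in for the separate platforms, and the table and column names are assumptions), but the shape is the point: a single request fans out to each store, and results are merged on the fly without relocating any data.

```python
import sqlite3

def make_store(rows):
    """Create an in-memory SQLite store standing in for one remote platform."""
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE reports (region TEXT, threat_level INTEGER)")
    db.executemany("INSERT INTO reports VALUES (?, ?)", rows)
    return db

# Two platforms, each holding its own slice of the data.
field_store = make_store([("north", 3), ("east", 5)])
intel_store = make_store([("north", 4), ("west", 2)])

def federated_query(region):
    """Query every store in place and merge the results; no data is copied."""
    results = []
    for store in (field_store, intel_store):
        cur = store.execute(
            "SELECT region, threat_level FROM reports WHERE region = ?",
            (region,))
        results.extend(cur.fetchall())
    return results

# One question from the decision maker reaches every platform at once.
print(federated_query("north"))
```

A real virtualization layer would also push filters down to each store and handle differing schemas, but the contrast with the slow alternative, bulk-copying every platform's data into one warehouse first, is already visible here.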

In the meantime, defense misses out on the critical benefits that virtualization brings. One of the main benefits of virtualization in managing big data is self-service access to information, which saves both time and money. These technologies enable the integration of data from various sources, along with the sorting and analysis of big data and its conversion into relevant information depending on the request made. This increases the efficiency and effectiveness of information sharing within the defense sector, improving the success of military operations.




In today's world, where terrorism is on the rise, the importance of accurate information being delivered on time to military leaders for decision making can never be overstated. A time lag can make the difference between a foiled terrorist attack and a repeat of 9/11. All government agencies should therefore work together to ensure that our military has the best and most effective technologies for processing big data and sharing the resulting information in a timely manner.

To achieve this, all hurdles to realizing this goal should be systematically eliminated so that our defense department stays ahead of our enemies where information is concerned. In this regard, it would be prudent for the military to receive certain exemptions from government procurement rules to speed its uptake of these emerging virtualization technologies. Another critical factor is training information technology experts within defense, so as to create a culture suited to the new technologies and ensure that people and technology work together to give the best results.


As far as risk averseness is concerned, I think it is a delicate balancing act. On one hand, the military needs to be extremely cautious about new technologies to avoid security lapses. On the other hand, too slow an uptake gives our enemies an edge in information utilization, exposing us to serious threats. The military should therefore work with the developers, or even develop similar technologies in house, to ensure that the virtualization technologies are secure enough. This especially entails access controls where sensitive military data is concerned.
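The access-control point can be made concrete with a minimal sketch. The clearance levels, role names, and dataset labels below are all hypothetical; the idea is simply that a virtualization layer checks a requester's clearance against a dataset's classification before any underlying store is touched.

```python
# Hypothetical clearance levels per role and classification level per dataset.
CLEARANCE = {"analyst": 2, "commander": 4}
DATASET_LEVEL = {"logistics": 1, "signals": 3, "operations": 4}

def authorize(user, dataset):
    """Allow a query only if the user's clearance meets the dataset's level."""
    level = DATASET_LEVEL.get(dataset)
    if level is None:
        raise KeyError(f"unknown dataset: {dataset}")
    return CLEARANCE.get(user, 0) >= level

# A commander may read signals data; an analyst may not read operations data.
assert authorize("commander", "signals")
assert not authorize("analyst", "operations")
```

A production system would add authentication, auditing, and row- or column-level filtering, but even this gatekeeper shape shows where the check belongs: in the virtualization layer itself, in front of every data store it federates.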




Military operations involve highly classified information; is virtualization really safe as far as unauthorized access is concerned?

Just how prone is virtualization to hacking compared with legacy methods of big data processing?


What relationship would exist between the developer and the military were the military to adopt these technologies?



