Which of the following technologies would be MOST appropriate to utilize when testing a new software
patch before a company-wide deployment?
A. Cloud Computing
B. Virtualization
C. Redundancy
D. Application control
Correct Answer: B
Explanation:
Virtualization hosts one or more guest operating systems in the memory of a single host computer,
allowing multiple operating systems to run simultaneously on the same hardware and reducing costs.
Virtualization also offers the flexibility to quickly and easily snapshot entire virtual systems and to
recover them quickly when errors occur. Furthermore, malicious code that compromises a virtual
system rarely affects the host system, which allows for safer testing and experimentation.
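As a minimal sketch of that snapshot-and-revert workflow, the script below drives VirtualBox's VBoxManage CLI from Python. The VM name, snapshot label, and the test_patch_in_guest step are hypothetical placeholders, and this assumes VBoxManage is on the PATH; other hypervisors (e.g. libvirt's virsh) offer equivalent snapshot commands.

import subprocess
import sys

# Hypothetical names; substitute your own VM and snapshot labels.
VM_NAME = "patch-test-vm"
SNAPSHOT = "pre-patch-baseline"

def vbox(*args, check=True):
    """Invoke a VBoxManage subcommand."""
    return subprocess.run(["VBoxManage", *args], check=check)

def test_patch_in_guest():
    # Placeholder: apply the patch and run validation inside the guest,
    # e.g. over SSH or with 'VBoxManage guestcontrol' (guest-OS specific).
    raise NotImplementedError("patch deployment/validation goes here")

def main():
    # Snapshot the whole VM first so the entire system state can be
    # rolled back if the patch misbehaves.
    vbox("snapshot", VM_NAME, "take", SNAPSHOT,
         "--description", "Known-good state before patch test")
    try:
        test_patch_in_guest()
        print("Patch verified; safe to plan the company-wide rollout.")
    except Exception as exc:
        print(f"Patch test failed ({exc}); reverting to snapshot.",
              file=sys.stderr)
        # The VM must be powered off before a snapshot can be restored.
        vbox("controlvm", VM_NAME, "poweroff", check=False)
        vbox("snapshot", VM_NAME, "restore", SNAPSHOT)
        vbox("startvm", VM_NAME, "--type", "headless")

if __name__ == "__main__":
    main()

Because the snapshot captures the entire disk and memory state, a failed patch costs only the time to revert, which is exactly the safety property the explanation above relies on.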
Not sure I agree. Cloud computing is plausible for this, but so is virtualization. If I have a software patch and I want to test it, ideally I could test it in a virtualized environment. Worst-case scenario, I just have to reload the files.
If I have IaaS, I could also deploy the patch there, but that would be expensive. Most of the time, if you have IaaS, you are running production workloads through it.
I cannot firmly discount either, but I would tend toward virtualization as most appropriate.