December 8, 2020


The Evolution Of IT


Information Technology, or IT in mundane terms, covers a range of technologies including, but not limited to, the hardware, software, and communication technologies that create, process, store, secure, and exchange data. Like any other human advancement, modern IT has taken eons to reach its present height, with each product, era, or concept enjoying only ephemeral importance before being replaced by newer, more advanced technology.

IT has developed over the course of nearly six decades, but the first traces of computing devices date back to around 500 BCE with the abacus. The Z1, created in 1936 by Konrad Zuse, is considered the first functional, programmable modern computer; this electromechanical machine predates our modern IT. The first electronic computer, ENIAC (Electronic Numerical Integrator and Computer), was a programmable, general-purpose computer used to calculate artillery firing tables for the US Army. Its successor, EDVAC (Electronic Discrete Variable Automatic Computer), one of the earliest electronic computers, was designed as a binary stored-program machine. It used a direct-access architecture and had no operating system, and the lack of software applications limited its use to scientific computing.

The 1960s brought the era of mainframe computers: high-performance machines with large amounts of memory and processing power that handle billions of simple calculations and transactions in real time. The IBM System/360 was among the earliest and most influential mainframes. These mainframes were highly centralized computers, but they were soon followed by decentralized computing in the form of minicomputers, although the first personal computer, the Altair 8800, only became available in 1975 and was a hit among computer enthusiasts and entrepreneurs. Even then, it was far from an ideal machine, as it required extensive assembly, which undermined its ease of use.

The 1980s saw the introduction of the personal computer, also known as the PC. The IBM PC was one of the earliest PCs used both by enterprises and for personal computing. It ran an operating system from Microsoft, a company that at the time was in the early stages of its fame, and it offered a spectrum of software tools such as word processing and spreadsheets. Even so, the lack of networking software and hardware limited the PC to being a standalone desktop system, devoid of connectivity.

By the end of the 1990s, there was extensive demand among businesses for internet connectivity, since the lack of web access was limiting productivity. This new sense of vast connectivity between people and networks of computers increased the amount of data being generated and helped turn that data, through analysis, into useful information. This era also saw a boom in the number of IT vendors such as Intel, Oracle, Microsoft, Cisco, EMC and Dell. In the 2000s, the World Wide Web (WWW) made IT available globally, while computer networks extended IT access throughout entire enterprises. The WWW brought new applications to the common masses and digitized their daily lives. Internet access removed many of the limitations on sharing information with others, which led to an upsurge in the amount of data created, stored, shared and consumed.

Today, mobile and cloud computing have enabled convenient, on-demand access to a pool of computing resources such as applications, networks, servers, storage and services. These resources require minimal management effort and can be provisioned quickly. IT has become an integral part of any business as well as a necessity in people's day-to-day lives, and the daily advances in this field keep adding to its significance and prominence.
