1. Modern Technologies
1.1. Top Trending Technologies
1.1.1. Artificial Intelligence
1.1.1.1. AI existed even before the internet was born, but only now have the data-processing and compute-power backbone become strong enough to sustain an entire technology by itself. AI is everywhere today, from your smartphone to your car to your home to your bank. It is the new normal, something the world cannot do without.
1.1.2. Augmented Reality and Virtual Reality
1.1.2.1. Virtual is real! VR and AR, the twin technologies that let you experience virtual environments that come extremely close to reality, are today being used by businesses of all sizes and shapes, even though the underlying technology can be quite complex. Medical students use AR to practice surgery in a controlled environment, while VR opens up new avenues for gaming and interactive marketing.
1.1.3. Cognitive Cloud Computing
1.1.3.1. Cognitive Cloud is an extended ecosystem that combines traditional cloud computing with cognitive computing. Because of this, you can build cognitive computing applications and bring them to the masses through cloud deployments. Cognitive computing is considered the next big evolution in the IT industry: it converses in human language and helps experts make better decisions by understanding the complexities of Big Data. Its market is expected to generate revenue of $13.8 billion by 2020, making it one of the top 10 trending technologies to consider this year.
1.1.4. Angular and React
1.1.4.1. OK, now we are getting into core tech. Angular and React are JavaScript-based frameworks for building modern web applications. Using React or Angular, you can create a highly modular web app, so adding a new feature does not require sweeping changes across your code base. Both also let you build native mobile applications with the same JavaScript, CSS, and HTML knowledge. Best part: they are open source with highly active community support.
1.1.5. DevOps
1.1.5.1. This is the odd one out in the list: it is not a technology but a methodology. DevOps is a methodology that ensures development and operations go hand in hand. The DevOps cycle is pictured as an infinite loop, representing the integration of development and operations teams through automated infrastructure, automated workflows, and continuous measurement of application performance.
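The "continuously measuring application performance" part of that loop can be made concrete with a small, hedged sketch: the Python script below repeatedly polls a health-check URL and prints the response time. The URL and polling interval are assumptions for illustration; real DevOps pipelines normally delegate this to dedicated monitoring tools rather than a hand-rolled script.

import time
import urllib.request

HEALTH_URL = "http://localhost:8080/health"  # hypothetical endpoint
INTERVAL_SECONDS = 30                         # hypothetical polling interval

def check_once(url: str) -> float:
    """Request the health endpoint and return the response time in seconds."""
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=5) as response:
        response.read()
    return time.monotonic() - start

if __name__ == "__main__":
    while True:
        try:
            elapsed = check_once(HEALTH_URL)
            print(f"OK   {HEALTH_URL}  {elapsed:.3f}s")
        except Exception as exc:  # connection errors, HTTP errors, timeouts
            print(f"FAIL {HEALTH_URL}  {exc}")
        time.sleep(INTERVAL_SECONDS)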
1.1.6. RPA (Robotic Process Automation)
1.1.6.1. Generally, any desk job in any industry involves tasks that are repetitive in nature and can be automated. RPA, or Robotic Process Automation, lets you automate such routine and repetitive tasks without writing any code. In 2020, the trend of bots and machine learning is only going to skyrocket, which means RPA will become an invaluable skill to have.
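To make "routine and repetitive tasks" concrete, the sketch below automates one such chore in Python: sorting the files in a folder into sub-folders by extension. The folder names are hypothetical, and in an actual RPA tool such as UiPath or Blue Prism you would assemble the same flow visually, without writing code.

from pathlib import Path
import shutil

INBOX = Path("inbox")    # hypothetical folder of unsorted files
SORTED = Path("sorted")  # hypothetical destination root

def sort_by_extension(inbox: Path, destination: Path) -> None:
    """Move every file in `inbox` into a sub-folder of `destination` named after its extension."""
    for item in inbox.iterdir():
        if not item.is_file():
            continue
        suffix = item.suffix.lstrip(".").lower() or "no_extension"
        target_dir = destination / suffix
        target_dir.mkdir(parents=True, exist_ok=True)
        shutil.move(str(item), str(target_dir / item.name))

if __name__ == "__main__":
    sort_by_extension(INBOX, SORTED)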
2. Ancient Computing Devices
2.1. Arithmometer (1820) The Arithmometer was patented by Thomas de Colmar in 1820 and was the first successful mechanical calculator to be used in offices. From 1851 to 1890, the Arithmometer was the only type of mechanical calculator in production. The machine could perform addition and subtraction directly, and could also carry out long multiplication and division.
2.2. The Abacus There is a long history detailing the invention of computing and calculating machines. The earliest recorded calculating device is the abacus. Used as a simple device for performing arithmetic, the abacus most likely first appeared in Babylonia (now Iraq) over 5,000 years ago. Its more familiar form today is derived from the Chinese version. The abacus is more of a counting device than a true calculator; nonetheless, it was used for centuries as a reliable means of doing addition and subtraction.
2.3. Stepped Reckoner (1694) The Stepped Reckoner was invented by Gottfried Wilhelm Leibniz. It was the first machine that could calculate all four arithmetic operations. It was also the first machine to use a cursor, giving it a memory for storing the first operand. Leibniz built two Stepped Reckoners, one in 1694 and one in 1706. The second one was forgotten in an attic at the University of Göttingen and was rediscovered 250 years later!
3. Blockchain Technology
3.1. If you have been following banking, investing, or cryptocurrency over the last ten years, you may be familiar with "blockchain," the record-keeping technology behind the Bitcoin network. And there's a good chance that it only makes so much sense. In trying to learn more about blockchain, you've probably encountered a definition like this: "blockchain is a distributed, decentralized, public ledger."
3.1.1. Blocks store information about transactions, like the date, time, and dollar amount of your most recent purchase from Amazon. (NOTE: This Amazon example is for illustrative purposes; Amazon retail does not work on a blockchain principle as of this writing.)
3.1.2. Blocks store information that distinguishes them from other blocks. Much like you and I have names to distinguish us from one another, each block stores a unique code called a "hash" that allows us to tell it apart from every other block. Hashes are cryptographic codes created by special algorithms. Let's say you made a splurge purchase on Amazon, but while it's in transit, you decide you just can't resist and need a second one. Even though the details of your new transaction would look nearly identical to your earlier purchase, we can still tell the blocks apart because of their unique codes.
3.1.3. Blocks store information about who is participating in transactions. A block for your splurge purchase from Amazon would record your name along with Amazon.com, Inc. (AMZN). However, instead of your actual name, your purchase is recorded without any identifying information, using a unique "digital signature," sort of like a username.
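A small, hedged sketch of the hashing idea: the Python snippet below derives a block's hash from its contents, so two nearly identical purchases still end up with different hashes because their timestamps and previous-hash links differ. The field names and the single SHA-256 pass are illustrative assumptions; a real network such as Bitcoin hashes a specific block-header format rather than arbitrary JSON.

import hashlib
import json

def block_hash(index, timestamp, transactions, previous_hash):
    """Hash the block's contents; changing any field changes the hash."""
    payload = json.dumps(
        {
            "index": index,
            "timestamp": timestamp,
            "transactions": transactions,
            "previous_hash": previous_hash,
        },
        sort_keys=True,
    ).encode("utf-8")
    return hashlib.sha256(payload).hexdigest()

# Two near-identical purchases, yet distinct hashes: the timestamps and the
# previous_hash links differ, so the blocks can always be told apart.
first = block_hash(1, "2020-01-01T10:00:00", [{"to": "Amazon", "amount": 49.99}], "0" * 64)
second = block_hash(2, "2020-01-01T10:05:00", [{"to": "Amazon", "amount": 49.99}], first)
print(first)
print(second)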
4. Human-Computer Interaction
4.1. Human-computer interaction (HCI) is a multidisciplinary field of study focusing on the design of computer technology and, in particular, the interaction between humans (the users) and computers. While initially concerned with computers, HCI has since expanded to cover almost all forms of information technology design.
4.1.1. The Meteoric Rise of HCI HCI surfaced in the 1980s with the advent of personal computing, just as machines such as the Apple Macintosh, IBM PC 5150 and Commodore 64 started turning up in homes and offices in society-changing numbers. For the first time, sophisticated electronic systems were available to general consumers for uses such as word processors, games units and accounting aids. Consequently, as computers were no longer room-sized, expensive tools exclusively built for experts in specialized environments, the need to create human-computer interaction that was also easy and efficient for less experienced users became increasingly vital. From its origins, HCI would expand to incorporate multiple disciplines, such as computer science, cognitive science and human-factors engineering.
5. Operating System
5.1. An operating system is the core set of software on a device that keeps everything together. Operating systems communicate with the device's hardware. They handle everything from your keyboard and mouse to the Wi-Fi radio, storage devices, and display. In other words, an operating system handles input and output devices.
5.1.1. Following are some of the important functions of an operating system: memory management, processor management, device management, file management, security, control over system performance, job accounting, error-detecting aids, and coordination between other software and users.
5.1.1.1. Memory Management Memory management refers to the management of primary memory, or main memory. Main memory is a large array of words or bytes, each with its own address, and it provides fast storage that the CPU can access directly. For a program to be executed, it must be in main memory. For memory management, the operating system keeps track of primary memory, i.e., which parts are in use by whom and which parts are free; in multiprogramming, it decides which process will get memory, when, and how much; it allocates memory when a process requests it; and it de-allocates memory when a process no longer needs it or has terminated.
5.1.1.2. Processor Management In a multiprogramming environment, the OS decides which process gets the processor, when, and for how long. This function is called process scheduling. For processor management, the operating system keeps track of the processor and the status of processes (the program responsible for this task is known as the traffic controller), allocates the processor (CPU) to a process, and de-allocates the processor when it is no longer required.
5.1.1.3. Device Management An operating system manages device communication via the devices' respective drivers. For device management, it keeps track of all devices (the program responsible for this task is known as the I/O controller), decides which process gets a device, when, and for how long, allocates devices efficiently, and de-allocates them.
5.1.1.4. File Management A file system is normally organized into directories for easy navigation and usage; these directories may contain files and other directories. For file management, the operating system keeps track of information, location, usage, status, and so on (these collective facilities are often known as the file system), decides who gets the resources, allocates the resources, and de-allocates them.
5.1.1.5. Other Important Activities Security: by means of passwords and similar techniques, the OS prevents unauthorized access to programs and data. Control over system performance: recording delays between a request for a service and the system's response. Job accounting: keeping track of the time and resources used by various jobs and users. Error-detecting aids: production of dumps, traces, error messages, and other debugging and error-detecting aids. Coordination between other software and users: coordination and assignment of compilers, interpreters, assemblers, and other software to the various users of the computer system.
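As a toy illustration of the process-scheduling idea described under Processor Management, the Python sketch below runs a round-robin scheduler over a few hypothetical processes with made-up burst times; real kernels implement scheduling with far more state and sophistication.

from collections import deque

def round_robin(processes, quantum):
    """processes: list of (name, burst_time) pairs. Returns (name, slice_length, finish_time) tuples."""
    queue = deque(processes)
    timeline = []
    clock = 0
    while queue:
        name, remaining = queue.popleft()   # allocate the CPU to the next process
        run = min(quantum, remaining)
        clock += run
        timeline.append((name, run, clock))
        remaining -= run
        if remaining > 0:
            queue.append((name, remaining)) # not finished: rejoin the back of the queue
    return timeline

if __name__ == "__main__":
    for name, ran, at in round_robin([("P1", 5), ("P2", 3), ("P3", 8)], quantum=2):
        print(f"{name} ran for {ran} unit(s); slice ended at t={at}")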