Information technology




Information technology (IT) is "the study, design, development, implementation, support or management of computer-based information systems, particularly software applications and computer hardware", according to the Information Technology Association of America (ITAA). IT deals with the use of electronic computers and computer software to securely convert, store, protect, process, transmit, input, output, and retrieve information.
Today, the term information technology has ballooned to encompass many aspects of computing and technology, and the term has become very recognizable. IT professionals perform a variety of duties that range from installing applications to designing complex computer networks and information databases. A few of the duties that IT professionals perform may include data management, networking, engineering computer hardware, database and software design, as well as the management and administration of entire systems. Information technology is spreading beyond conventional personal computer and network technology into integrations with other technologies such as cell phones, televisions, and automobiles, which is increasing the demand for such jobs.
When computer and communications technologies are combined, the result is information technology, sometimes called "infotech." Information technology is a general term that describes any technology that helps to produce, manipulate, store, communicate, and/or disseminate information.
In recent years, ABET and the ACM have collaborated to form accreditation and curriculum standards for degrees in Information Technology as a distinct field of study, separate from both Computer Science and Information Systems. SIGITE is the ACM working group for defining these standards. Worldwide IT services revenue totalled $763 billion in 2009.
It is important to consider the overall value chain in technology development projects, as the challenge of value creation is increasing with the growing competitiveness between organizations (Bird, 2010). The concept of value creation through technology is heavily dependent upon the alignment of technology and business strategies. While value creation for an organization is a network of relationships between internal and external environments, technology plays an important role in improving the overall value chain of an organization. However, this improvement requires business and technology management to work as a creative, synergistic, and collaborative team instead of a purely mechanistic span of control.



Computer

A computer is a programmable machine that receives input, stores and manipulates data/information, and provides output in a useful format.
While a computer can, in theory, be made out of almost anything, and mechanical examples of computers have existed through much of recorded human history, the first electronic computers were developed in the mid-20th century (1940–1945). Originally, they were the size of a large room, consuming as much power as several hundred modern personal computers (PCs). Modern computers based on integrated circuits are millions to billions of times more capable than the early machines, and occupy a fraction of the space. Simple computers are small enough to fit into mobile devices, and can be powered by a small battery. Personal computers in their various forms are icons of the Information Age and are what most people think of as "computers". However, the embedded computers found in many devices from MP3 players to fighter aircraft and from toys to industrial robots are the most numerous.
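As a minimal sketch of the input, store, process, and output cycle described above (the function name and sample data are illustrative assumptions, not from the source), a few lines of Python can mirror the definition:

    # Hypothetical illustration of a computer's basic cycle:
    # receive input, store it, manipulate it, and provide output.
    def run(values):
        stored = list(values)             # store: keep the input data in memory
        total = sum(stored)               # manipulate: process the stored data
        return "sum = " + str(total)      # output: a result in a useful format

    print(run([1, 2, 3]))                 # input: the data supplied to the machine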



Artificial intelligence

Computers solve problems in exactly the way they are programmed to, without regard to efficiency, alternative solutions, possible shortcuts, or possible errors in the code. Computer programs which learn and adapt are part of the emerging field of artificial intelligence and machine learning.
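To make the contrast concrete, the following is a minimal sketch of a program that "learns and adapts" rather than following a fixed recipe: one-parameter gradient descent fitting y = w * x to example data. The data, learning rate, and iteration count are illustrative assumptions, not from the source.

    # Hypothetical sketch of a learning program: the parameter w is adjusted
    # from data instead of being fixed in advance by the programmer.
    data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]   # examples of x -> y (here y = 2x)
    w = 0.0                                        # the parameter the program adapts
    for _ in range(100):
        for x, y in data:
            error = w * x - y                      # how wrong the current guess is
            w -= 0.05 * error * x                  # nudge w to reduce the error

    print(round(w, 3))                             # converges near 2.0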

Hardware

The term hardware covers all of those parts of a computer that are tangible objects. Circuits, displays, power supplies, cables, keyboards, printers and mice are all hardware.
History of computing hardware

First Generation (Mechanical/Electromechanical): calculators; programmable devices
Second Generation (Vacuum Tubes): calculators
Third Generation (Discrete transistors and SSI, MSI, LSI integrated circuits)
Fourth Generation (VLSI integrated circuits): minicomputers; 4-bit, 8-bit, 16-bit, 32-bit, and 64-bit microcomputers
Theoretical/experimental

Other hardware topics

Peripheral devices: input, output, or both
Computer buses: short range; long range (computer networking)



Software

Computer software

Operating systems: Unix and BSD; experimental
Programming library
File formats: HTML, XML, JPEG, MPEG, PNG
Internet access
Design and manufacturing: computer-aided design, computer-aided manufacturing, plant management, robotic manufacturing, supply chain management
Educational
Misc

History of computer science



The history of computer science began long before the modern discipline of computer science emerged in the twentieth century, and was hinted at in the centuries prior. The progression from mechanical inventions and mathematical theories towards modern concepts and machines formed a major academic field and the basis of a massive worldwide industry.

Birth of computer science

Before the 1920s, computers (sometimes "computors") were human clerks who performed computations. They were usually under the direction of a physicist. Many thousands of computers were employed in commerce, government, and research establishments. Most of these computers were women, and many of them held degrees in calculus. Some performed astronomical calculations for calendars.
After the 1920s, the expression computing machine referred to any machine that performed the work of a human computer, especially one working in accordance with the effective methods of the Church-Turing thesis. The thesis states that a mathematical method is effective if it can be set out as a list of instructions able to be followed by a human clerk with paper and pencil, for as long as necessary, and without ingenuity or insight.
Machines that computed with continuous values became known as analog machines. They used machinery that represented continuous numeric quantities, like the angle of a shaft rotation or a difference in electrical potential.
Digital machinery, in contrast to analog, represented numeric values as discrete states and stored each individual digit. Digital machinery used difference engines or relays before the invention of faster memory devices.
The phrase computing machine gradually gave way, after the late 1940s, to just computer as electronic digital machinery became common. These computers were able to perform the calculations that had previously been performed by human clerks.
Since the values stored by digital machines were not bound to physical properties the way analog devices were, a logical computer, based on digital equipment, was able to do anything that could be described as "purely mechanical." The theoretical Turing machine, created by Alan Turing, is a hypothetical device designed to study the properties of such hardware.
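To make the idea of a "purely mechanical" procedure concrete, the following is a minimal sketch of a Turing machine simulator in Python. The rule format and the example rule table (a unary increment machine) are illustrative assumptions, not from the source.

    # Hypothetical sketch: simulate a Turing machine from a rule table.
    # Each rule maps (state, symbol) -> (symbol to write, move L/R, next state).
    def run_turing_machine(rules, tape, state="start", blank="_"):
        cells = dict(enumerate(tape))     # sparse tape, unbounded in both directions
        head = 0
        while state != "halt":
            symbol = cells.get(head, blank)
            write, move, state = rules[(state, symbol)]
            cells[head] = write
            head += 1 if move == "R" else -1
        return "".join(cells[i] for i in sorted(cells))

    # Example: scan right over 1s, then append one more 1 (unary increment).
    rules = {
        ("start", "1"): ("1", "R", "start"),
        ("start", "_"): ("1", "R", "halt"),
    }
    print(run_turing_machine(rules, "111"))   # prints 1111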