Computing



Computing is any activity that uses computers to manage, process, and communicate information. It includes the development of both hardware and software. Computing has become a critical, integral component of modern industrial technology. Major computing disciplines include computer engineering, computer science, cybersecurity, data science, information systems, information technology, and software engineering.

Definitions

The ACM Computing Curricula 2005 and 2020 define “computing”, in a general sense, as any goal-oriented activity requiring, benefiting from, or creating computing machinery.

ACM also defines seven sub-disciplines of the computing field, which are discussed below.

However, Computing Curricula 2005 also recognizes that the meaning of “computing” depends on the context.

The term “computing” has sometimes been defined more narrowly, as in a 1989 ACM report on Computing as a Discipline.

The term “computing” is also synonymous with counting and calculating. In earlier times, it was used in reference to the action performed by mechanical computing machines, and before that, to human computers.

History

The history of computing is longer than the history of computing hardware and modern computing technology; it includes the history of methods intended for pen and paper, or for chalk and slate, with or without the aid of tables.

Computing is intimately tied to the representation of numbers. But long before abstractions like numbers arose, there were mathematical concepts serving the purposes of civilization.

These concepts include one-to-one correspondence (the basis of counting), comparison to a standard (used for measurement), and the 3-4-5 right triangle (a device for assuring a right angle).

The earliest known tool for use in computation was the abacus, thought to have been invented in Babylon circa 2400 BC. Its original style of usage was by lines drawn in sand with pebbles. Abaci of a more modern design are still used as calculation tools today. The abacus was the first known calculation aid, preceding Greek methods by 2,000 years.

The first recorded idea of using digital electronics for computing was the 1931 paper “The Use of Thyratrons for High Speed Automatic Counting of Physical Phenomena” by C. E. Wynn-Williams. Claude Shannon's 1938 paper “A Symbolic Analysis of Relay and Switching Circuits” then introduced the idea of using electronics for Boolean algebraic operations.

The concept of a field-effect transistor was proposed by Julius Edgar Lilienfeld in 1925. John Bardeen and Walter Brattain, while working under William Shockley at Bell Labs, built the first working transistor, the point-contact transistor, in 1947. In 1953, the University of Manchester built the first transistorized computer, called the Transistor Computer. However, early junction transistors were relatively bulky devices that were difficult to manufacture on a mass-production basis, which limited them to a number of specialised applications. The metal–oxide–semiconductor field-effect transistor (MOSFET, or MOS transistor) was invented by Mohamed Atalla and Dawon Kahng at Bell Labs in 1959. It was the first truly compact transistor that could be miniaturised and mass-produced for a wide range of uses. The MOSFET made it possible to build high-density integrated circuit chips, leading to what is known as the computer revolution or microcomputer revolution.

Computer

A computer is a machine that manipulates data according to a set of instructions called a computer program. The program has an executable form that the computer can use directly to execute the instructions. The same program in its human-readable source code form enables a programmer to study and develop a sequence of steps known as an algorithm. Because the instructions can be carried out on different types of computers, a single set of source instructions is converted to machine instructions according to the CPU type.

The execution process carries out the instructions in a computer program. Instructions express the computations performed by the computer. They trigger sequences of simple actions on the executing machine. Those actions produce effects according to the semantics of the instructions.
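As a small illustration of the relationship between human-readable source code and the executable form a machine runs, the sketch below (plain Python, standard library only) compiles a source string and disassembles the resulting instructions; here CPython bytecode stands in for machine instructions, and the function name add_tax is an arbitrary example, not from the text.

<code python>
# Minimal sketch: one program in two forms. CPython bytecode is used
# here as a stand-in for the machine instructions of a real CPU.
import dis

source = """
def add_tax(price, rate):
    return price + price * rate
"""

# Human-readable form: the source text a programmer studies and edits.
print(source)

# Executable form: compile the source to a code object, then run it.
code = compile(source, "<example>", "exec")
namespace = {}
exec(code, namespace)
print(namespace["add_tax"](100.0, 0.2))  # -> 120.0

# Show the low-level instruction sequence that actually executes.
dis.dis(namespace["add_tax"])
</code>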

Computer software and hardware

Computer software, or just “software”, is a collection of computer programs and related data that provides the instructions telling a computer what to do and how to do it. Software refers to one or more computer programs and data held in the storage of the computer for some purpose. In other words, software is a set of programs, procedures, algorithms, and their documentation concerned with the operation of a data processing system. Program software performs the function of the program it implements, either by directly providing instructions to the computer hardware or by serving as input to another piece of software. The term was coined to contrast with the old term hardware (meaning physical devices); in contrast to hardware, software is intangible. Software is also sometimes used in a narrower sense, meaning application software only.

Application software

Application software, also known as an “application” or an “app”, is computer software designed to help the user perform specific tasks. Examples include enterprise software, accounting software, office suites, graphics software, and media players. Many application programs deal principally with documents. Apps may be bundled with the computer and its system software, or may be published separately. Some users are satisfied with the bundled apps and need never install additional applications.

Application software is contrasted with system software and middleware, which manage and integrate a computer's capabilities, but typically do not directly apply them in the performance of tasks that benefit the user. The system software serves the application, which in turn serves the user.

Application software applies the power of a particular computing platform or system software to a particular purpose. Some apps, such as Microsoft Office, are available in versions for several different platforms; others have narrower requirements and are thus called, for example, a geography application for Windows, an Android application for education, or a Linux game. Sometimes a new and popular application arises that runs on only one platform, increasing the desirability of that platform. This is called a killer application.

System software

System software, or systems software, is computer software designed to operate and control the computer hardware, and to provide a platform for running application software. System software includes operating systems, utility software, device drivers, window systems, and firmware. Frequently used development tools such as compilers, linkers, and debuggers are classified as system software.

Computer network

A computer network, often simply referred to as a network, is a collection of hardware components and computers interconnected by communication channels that allow sharing of resources and information. Two devices are said to be networked when at least one process in one device is able to send data to, or receive data from, at least one process residing in the other device.

Networks may be classified according to a wide variety of characteristics such as the medium used to transport the data, communications protocol used, scale, topology, and organizational scope.

Communications protocols define the rules and data formats for exchanging information in a computer network, and provide the basis for network programming. Well-known communications protocols include Ethernet, a hardware and Link Layer standard that is ubiquitous in local area networks, and the Internet Protocol Suite, which defines a set of protocols for internetworking, i.e. for data communication between multiple networks, as well as host-to-host data transfer, and application-specific data transmission formats.
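To make the host-to-host data transfer described above concrete, here is a minimal sketch (Python standard library only) in which a TCP server process and a client process on the same machine exchange a short message over the Internet Protocol Suite; the port number 50007 and the message text are arbitrary choices for the example.

<code python>
# Minimal sketch: two processes exchanging data over TCP/IP.
# Port 50007 and the message are arbitrary example values.
import socket
import threading

HOST, PORT = "127.0.0.1", 50007
ready = threading.Event()

def server():
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((HOST, PORT))
        srv.listen(1)
        ready.set()                         # signal that we can accept
        conn, _addr = srv.accept()          # wait for the client process
        with conn:
            data = conn.recv(1024)          # receive the client's bytes
            conn.sendall(b"echo: " + data)  # reply on the same connection

t = threading.Thread(target=server)
t.start()
ready.wait()  # don't connect before the server is listening

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
    cli.connect((HOST, PORT))
    cli.sendall(b"hello network")
    print(cli.recv(1024).decode())  # -> "echo: hello network"
t.join()
</code>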

Computer networking is sometimes considered a sub-discipline of electrical engineering, telecommunications, computer science, information technology or computer engineering, since it relies upon the theoretical and practical application of these disciplines.

Internet

The Internet is a global system of interconnected computer networks that use the standard Internet protocol suite (TCP/IP) to serve billions of users. It consists of millions of private, public, academic, business, and government networks, of local to global scope, linked by a broad array of electronic, wireless, and optical networking technologies. The Internet carries an extensive range of information resources and services, such as the inter-linked hypertext documents of the World Wide Web and the infrastructure to support email.
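As a concrete illustration, the short sketch below (Python standard library; example.com is just a conventional demonstration hostname, not from the text) resolves a host name and retrieves a World Wide Web document over TCP/IP.

<code python>
# Minimal sketch: fetching a hypertext document over the Internet.
# "example.com" is a conventional demonstration host.
import socket
import urllib.request

# Name resolution: map a host name to an IP address.
print(socket.gethostbyname("example.com"))

# HTTP over TCP/IP: retrieve an inter-linked hypertext document.
with urllib.request.urlopen("http://example.com/") as response:
    page = response.read()
print(page[:60])  # the first bytes of the HTML
</code>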

Computer programming

Computer programming in general is the process of writing, testing, debugging, and maintaining the source code and documentation of computer programs. This source code is written in a programming language, an artificial language that is often more restrictive or demanding than natural languages, but is easily translated by the computer. The purpose of programming is to invoke the desired behavior (customization) from the machine. Writing high-quality source code requires knowledge of both the application's domain and the computer science domain. The highest-quality software is thus developed by a team of various domain experts, each person a specialist in some area of development. However, the term programmer may apply to a range of program quality, from hacker to open source contributor to professional, and a single programmer could do most or all of the computer programming needed to generate the proof of concept to launch a new “killer” application.
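As a small illustration of the write-test-debug cycle described above, the sketch below (plain Python; the median function and its tests are invented for this example) pairs a short algorithm with automated checks of the kind that catch a common off-by-one bug.

<code python>
# Minimal sketch of writing and testing source code.
# The median() function and its tests are invented for this example.

def median(values):
    """Return the median of a non-empty list of numbers."""
    ordered = sorted(values)
    n = len(ordered)
    mid = n // 2
    if n % 2 == 1:
        return ordered[mid]
    # Even length: average the two middle elements. Forgetting this
    # branch, or misindexing it, is the kind of bug testing catches.
    return (ordered[mid - 1] + ordered[mid]) / 2

# Testing: executable checks that document the intended behavior.
assert median([3, 1, 2]) == 2
assert median([4, 1, 3, 2]) == 2.5
print("all tests passed")
</code>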

Computer programmer

A programmer, computer programmer, or coder is a person who writes computer software. The term computer programmer can refer to a specialist in one area of computer programming or to a generalist who writes code for many kinds of software. One who practices or professes a formal approach to programming may also be known as a programmer analyst. A programmer's primary computer language (C, C++, Java, Lisp, Python, etc.) is often prefixed to the above titles, and those who work in a web environment often prefix their titles with web. The term programmer can be used to refer to a software developer, software engineer, computer scientist, or software analyst. However, members of these professions typically possess other software engineering skills beyond programming.

Computer industry

The computer industry is made up of all of the businesses involved in developing computer software, designing computer hardware and computer networking infrastructures, manufacturing computer components, and providing information technology services, including system administration and maintenance.

Software industry

The software industry includes businesses engaged in development, maintenance and publication of software. The industry also includes software services, such as training, documentation, and consulting.

Sub-disciplines of computing

Computer engineering

Computer engineering is a discipline that integrates several fields of electrical engineering and computer science required to develop computer hardware and software. Computer engineers usually have training in electronic engineering (or electrical engineering), software design, and hardware–software integration, rather than only software engineering or electronic engineering. Computer engineers are involved in many hardware and software aspects of computing, from the design of individual microprocessors, personal computers, and supercomputers, to circuit design. The field focuses not only on the design of hardware within its own domain, but also on the interactions between hardware and the world around it: “Computer engineers need not only to understand how computer systems themselves work, but also how they integrate into the larger picture. Consider the car. A modern car contains many separate computer systems for controlling such things as the engine timing, the brakes and the air bags. To be able to design and implement such a car, the computer engineer needs a broad theoretical understanding of all these various subsystems and how they interact.”

Software engineering

Software engineering (SE) is the application of a systematic, disciplined, quantifiable approach to the design, development, operation, and maintenance of software, and the study of these approaches; that is, the application of engineering to software. In layman's terms, it is the act of using insights to conceive, model, and scale a solution to a problem. The first reference to the term is the 1968 NATO Software Engineering Conference, where it was meant to provoke thought regarding the perceived “software crisis” of the time. Software development, a widely used and more generic term, does not necessarily subsume the engineering paradigm. The generally accepted concepts of software engineering as an engineering discipline are specified in the Guide to the Software Engineering Body of Knowledge (SWEBOK), which has become an internationally accepted standard, ISO/IEC TR 19759:2015.

Computer science

Computer science or computing science (abbreviated CS or Comp Sci) is the scientific and practical approach to computation and its applications. A computer scientist specializes in the theory of computation and the design of computational systems.

Its subfields can be divided into practical techniques for its implementation and application in computer systems and purely theoretical areas. Some, such as computational complexity theory, which studies fundamental properties of computational problems, are highly abstract, while others, such as computer graphics, emphasize real-world applications. Still others focus on the challenges in implementing computations. For example, programming language theory studies approaches to description of computations, while the study of computer programming itself investigates various aspects of the use of programming languages and complex systems, and human–computer interaction focuses on the challenges in making computers and computations useful, usable, and universally accessible to humans.
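As a tiny concrete instance of the complexity concerns mentioned above, the sketch below (plain Python; the two search functions are standard textbook examples, not from the text) contrasts a linear-time and a logarithmic-time solution to the same problem.

<code python>
# Minimal sketch: two algorithms for the same problem with different
# computational complexity. Both search a sorted list for a target.

def linear_search(items, target):
    """O(n): examine every element in the worst case."""
    for i, value in enumerate(items):
        if value == target:
            return i
    return -1

def binary_search(items, target):
    """O(log n): halve the search interval at each step."""
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid
        if items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

data = list(range(1_000_000))
# Same answer, vastly different work: ~765k comparisons vs ~20.
assert linear_search(data, 765432) == binary_search(data, 765432) == 765432
</code>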

Information systems

“Information systems (IS)” is the study of complementary networks of hardware and software (see information technology) that people and organizations use to collect, filter, process, create, and distribute data.

The study bridges business and computer science, using the theoretical foundations of information and computation to study various business models and related algorithmic processes within a computer science discipline.

Computer Information System(s) (CIS)

This field studies computers and algorithmic processes, including their principles, their software and hardware designs, their applications, and their impact on society, while IS emphasizes functionality over design.

Information technology

Information technology (IT) is the application of computers and telecommunications equipment to store, retrieve, transmit, and manipulate data, often in the context of a business or other enterprise. The term is commonly used as a synonym for computers and computer networks, but it also encompasses other information distribution technologies such as television and telephones. Several industries are associated with information technology, such as computer hardware, software, electronics, semiconductors, internet, telecom equipment, e-commerce, and computer services. On the later, broader application of the term, Keary comments: “In its original application 'information technology' was appropriate to describe the convergence of technologies with application in the broad field of data storage, retrieval, processing, and dissemination. This useful conceptual term has since been converted to what purports to be concrete use, but without the reinforcement of definition…the term IT lacks substance when applied to the name of any function, discipline, or position.”

Systems administration

A system administrator, IT systems administrator, systems administrator, or sysadmin is a person employed to maintain and operate a computer system or network. The duties of a system administrator are wide-ranging, and may vary substantially from one organization to another. Sysadmins are usually charged with installing, supporting and maintaining servers or other computer systems, and planning for and responding to service outages and other problems. Other duties may include scripting or light programming, project management for systems-related projects, supervising or training computer operators, and being the consultant for computer problems beyond the knowledge of technical support staff.
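As a small illustration of the “scripting or light programming” mentioned above, the sketch below (plain Python, standard library only; the checked path and the 90% threshold are arbitrary example values) is the kind of one-off monitoring script a sysadmin might write.

<code python>
# Minimal sketch of a sysadmin-style monitoring script.
# The path "/" and the 90% threshold are arbitrary example values.
import shutil

def disk_usage_percent(path="/"):
    """Return the percentage of the filesystem at `path` in use."""
    usage = shutil.disk_usage(path)
    return 100.0 * usage.used / usage.total

used = disk_usage_percent("/")
if used > 90.0:
    # In a real deployment this might email an operator or page on-call.
    print(f"WARNING: disk {used:.1f}% full")
else:
    print(f"OK: disk {used:.1f}% full")
</code>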

Research and emerging technologies

DNA-based computing and quantum computing are areas of active research in both hardware and software (such as the development of quantum algorithms). Potential infrastructure for future technologies includes DNA origami on photolithography and quantum antennae for transferring information between ion traps. By 2011, researchers had entangled 14 qubits. Fast digital circuits, including those based on Josephson junctions and rapid single flux quantum technology, are becoming more nearly realizable with the discovery of nanoscale superconductors: four pairs of certain molecules have been shown to form a nanoscale superconductor at a dimension of 0.87 nanometers (Saw-Wai Hla et al., “World's smallest superconductor discovered”, Nature Nanotechnology, 31 March 2010).

Fiber-optic and photonic (optical) devices, which have already been used to transport data over long distances, have started being used by data centers, side by side with CPU and semiconductor memory components. This allows the separation of RAM from CPU by optical interconnects (Tom Simonite, “Computing at the speed of light”, Technology Review, 4 August 2010). IBM has created an integrated circuit with both electronic and optical information processing in one chip, denoted “CMOS-integrated nanophotonics” (CINP) (Sebastian Anthony, “IBM creates first commercially viable silicon nanophotonic chip”, 10 December 2012). One benefit of optical interconnects is that motherboards that formerly required a certain kind of system on a chip (SoC) can now move formerly dedicated memory and network controllers off the motherboard, spreading the controllers out onto the rack. This allows standardization of backplane interconnects and motherboards for multiple types of SoCs, which allows more timely upgrades of CPUs (“Open Compute: Does the data center have an open future?”).

Another field of research is spintronics, which can provide computing power and storage without heat buildup (“Putting electronics in a spin”). Some research is being done on hybrid chips, which combine photonics and spintronics (“Integrating all-optical switching with spintronics”). There is also ongoing research on combining plasmonics, photonics, and electronics (“Plasmonic nanogap enhanced phase-change devices with dual electrical-optical functionality”).

Cloud computing

Cloud computing is a model that allows the use of computing resources, such as servers or applications, without the need for much interaction between the owner of those resources and the user. It is typically offered as a service, classified as Software as a Service (SaaS), Platform as a Service (PaaS), or Infrastructure as a Service (IaaS), depending on the functionality offered. Key characteristics include on-demand access, broad network access, and the capability of rapid scaling. Cloud computing allows individual users or small businesses to benefit from economies of scale.
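As a toy illustration of the on-demand, rapidly scaling character described above, the sketch below (plain Python; the Worker class, the per-worker capacity, and the request counts are all invented for the example) grows and shrinks a pool of workers with load instead of keeping a fixed fleet.

<code python>
# Toy sketch of cloud-style elasticity: capacity follows demand.
# Worker, REQUESTS_PER_WORKER, and the load values are invented for
# illustration; real clouds expose this through provider APIs.

class Worker:
    """Stand-in for a provisioned server instance."""

REQUESTS_PER_WORKER = 100  # assumed capacity of one instance

def autoscale(workers, pending_requests):
    """Grow or shrink the pool so capacity tracks demand."""
    needed = max(1, -(-pending_requests // REQUESTS_PER_WORKER))  # ceil div
    while len(workers) < needed:
        workers.append(Worker())   # on-demand provisioning
    while len(workers) > needed:
        workers.pop()              # release idle capacity (and cost)
    return workers

pool = [Worker()]
for load in [50, 420, 1300, 200, 10]:
    pool = autoscale(pool, load)
    print(f"load={load:4d} -> {len(pool)} worker(s)")
</code>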

One area of interest in this field is its potential to support energy efficiency. Allowing thousands of instances of computation to occur on one single machine instead of thousands of individual machines could help save energy. It could also ease the transition to more renewable energy, since it would suffice to power one server farm with a set of solar panels or wind turbines rather than millions of people's homes.

With centralized computing, the field poses several challenges, especially in security and privacy. Current legislation does not sufficiently protect users from companies mishandling their data on company servers. This suggests potential for further legislative regulation of cloud computing and tech companies.

Quantum computing

Quantum computing is an area of research that brings together the disciplines of computer science, information theory, and quantum physics. The idea of information being a basic part of physics is relatively new, but there appears to be a strong tie between information theory and quantum mechanics. Whereas traditional computing operates on a binary system of ones and zeros, quantum computing uses qubits. A qubit can be in a superposition, meaning it is in both states, one and zero, simultaneously; the qubit is not somewhere between 1 and 0, but rather yields one of the two values, probabilistically, when it is measured. Superposition, together with quantum entanglement, the strong correlation that can exist between qubits, is a core idea of quantum computing and is what allows quantum computers to perform the large-scale computations they are used for. Quantum computing is often proposed for scientific research where a normal computer does not have nearly enough computational power to do the necessary calculations. A good example is molecular modeling: large molecules are far too complex for modern computers to simulate through a reaction, but the power of quantum computers could open the door to further understanding of these molecules.
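To make the qubit description concrete, the short sketch below (plain Python, no quantum libraries; the chosen amplitudes are arbitrary) represents a single qubit as a pair of complex amplitudes and simulates repeated measurement.

<code python>
# Minimal sketch: a single qubit as two amplitudes (a, b) with
# |a|^2 + |b|^2 = 1. Measurement yields 0 with probability |a|^2
# and 1 with probability |b|^2. The amplitudes are example values.
import math
import random

# An equal superposition of |0> and |1>.
a = 1 / math.sqrt(2)   # amplitude of |0>
b = 1 / math.sqrt(2)   # amplitude of |1>

def measure(a, b):
    """Collapse the qubit: return 0 or 1 with Born-rule probabilities."""
    return 0 if random.random() < abs(a) ** 2 else 1

# Repeated measurements of freshly prepared qubits approach 50/50.
results = [measure(a, b) for _ in range(10_000)]
print("fraction of 1s:", sum(results) / len(results))
</code>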

