
Information technology (IT) is "the acquisition, processing, storage and dissemination of vocal, pictorial, textual and numerical information by a microelectronics-based combination of computing and telecommunications".

The term in its modern sense first appeared in a 1958 article published in the Harvard Business Review, in which authors Leavitt and Whisler commented that "the new technology does not yet have a single established name. We shall call it information technology."

General information

IT spans a wide variety of areas, including but not limited to processes, computer software, computer hardware, programming languages, and data constructs.

In short, anything that renders data, information or perceived knowledge in any visual format, via any multimedia distribution mechanism, is considered part of the domain known as Information Technology (IT).

IT professionals perform a variety of functions (IT Disciplines/Competencies) that range from installing applications to designing complex computer networks and information databases. A few of the duties that IT professionals perform may include data management, networking, engineering computer hardware, database and software design, as well as management and administration of entire systems. Information technology is spreading beyond conventional personal computer and network technology into integration with other technologies such as cell phones, televisions, and automobiles, which is increasing the demand for such jobs.

In the recent past, the Accreditation Board for Engineering and Technology and the Association for Computing Machinery have collaborated to form accreditation and curriculum standards for degrees in Information Technology as a field of study distinct from Computer Science and Information Systems.

SIGITE is the ACM working group for defining these standards. Worldwide IT services revenue totaled $763 billion in 2009.

Information theory is a branch of applied mathematics and electrical engineering involving the quantification of information. Information theory was developed by Claude E. Shannon to find fundamental limits on signal processing operations such as compressing data and on reliably storing and communicating data. Since its inception it has broadened to find applications in many other areas, including statistical inference, natural language processing, cryptography generally, networks other than communication networks — as in neurobiology, the evolution and function of molecular codes, model selection in ecology, thermal physics, quantum computing, plagiarism detection and other forms of data analysis.

A key measure of information is known as entropy, which is usually expressed by the average number of bits needed for storage or communication. Entropy quantifies the uncertainty involved in predicting the value of a random variable. For example, specifying the outcome of a fair coin flip (two equally likely outcomes) provides less information (lower entropy) than specifying the outcome from a roll of a die (six equally likely outcomes).
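For illustration, a minimal Python sketch that computes both entropies directly from the definition (the helper name entropy is just for this example):

```python
import math

def entropy(probs):
    """Shannon entropy, in bits, of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))  # fair coin: 1.0 bit
print(entropy([1/6] * 6))   # fair die: ~2.585 bits
```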

Applications of fundamental topics of information theory include lossless data compression (e.g. ZIP files), lossy data compression (e.g. MP3s), and channel coding (e.g. for DSL lines). The field is at the intersection of mathematics, statistics, computer science, physics, neurobiology, and electrical engineering.

Its impact has been crucial to the success of the Voyager missions to deep space, the invention of the compact disc, the feasibility of mobile phones, the development of the Internet, the study of linguistics and of human perception, the understanding of black holes, and numerous other fields.

Important sub-fields of information theory are source coding, channel coding, algorithmic complexity theory, algorithmic information theory, information-theoretic security, and measures of information.

Overview

The main concepts of information theory can be grasped by considering the most widespread means of human communication: language. Two important aspects of a concise language are as follows: First, the most common words (e.g., "a", "the", "I") should be shorter than less common words (e.g., "benefit", "generation", "mediocre"), so that sentences will not be too long. Such a tradeoff in word length is analogous to data compression and is the essential aspect of source coding. Second, if part of a sentence is unheard or misheard due to noise — e.g., a passing car — the listener should still be able to glean the meaning of the underlying message.

Such robustness is as essential for an electronic communication system as it is for a language; properly building such robustness into communications is done by channel coding. Source coding and channel coding are the fundamental concerns of information theory.

Note that these concerns have nothing to do with the importance of messages.

For example, a platitude such as "Thank you; come again" takes about as long to say or write as the urgent plea, "Call an ambulance!", while the latter may be more important and more meaningful in many contexts. Information theory, however, does not consider message importance or meaning, as these are matters of the quality of data rather than the quantity and readability of data, the latter of which is determined solely by probabilities.

Information theory is generally considered to have been founded in 1948 by Claude Shannon in his seminal work, "A Mathematical Theory of Communication".

The central paradigm of classical information theory is the engineering problem of the transmission of information over a noisy channel. The most fundamental results of this theory are Shannon's source coding theorem, which establishes that, on average, the number of bits needed to represent the result of an uncertain event is given by its entropy; and Shannon's noisy-channel coding theorem, which states that reliable communication is possible over noisy channels provided that the rate of communication is below a certain threshold, called the channel capacity. The channel capacity can be approached in practice by using appropriate encoding and decoding systems.

Information theory is closely associated with a collection of pure and applied disciplines that have been investigated and reduced to engineering practice under a variety of rubrics throughout the world over the past half century or more: adaptive systems, anticipatory systems, artificial intelligence, complex systems, complexity science, cybernetics, informatics, machine learning, along with systems sciences of many descriptions. Information theory is a broad and deep mathematical theory, with equally broad and deep applications, amongst which is the vital field of coding theory.

Coding theory is concerned with finding explicit methods, called codes, of increasing the efficiency and reducing the net error rate of data communication over a noisy channel to near the limit that Shannon proved is the maximum possible for that channel. These codes can be roughly subdivided into data compression (source coding) and error-correction (channel coding) techniques.

In the latter case, it took many years to find the methods Shannon's work proved were possible. A third class of information theory codes are cryptographic algorithms (both codes and ciphers). Concepts, methods and results from coding theory and information theory are widely used in cryptography and cryptanalysis.

See the article ban (information) for a historical application.

Information theory is also used in information retrieval, intelligence gathering, gambling, statistics, and even in musical composition.

Historical background

Main article: History of information theory

The landmark event that established the discipline of information theory, and brought it to immediate worldwide attention, was the publication of Claude E. Shannon's classic paper "A Mathematical Theory of Communication" in the Bell System Technical Journal in July and October 1948.

Prior to this paper, limited information-theoretic ideas had been developed at Bell Labs, all implicitly assuming events of equal probability. Harry Nyquist's 1924 paper, Certain Factors Affecting Telegraph Speed, contains a theoretical section quantifying "intelligence" and the "line speed" at which it can be transmitted by a communication system, giving the relation W = K log m, where W is the speed of transmission of intelligence, m is the number of different voltage levels to choose from at each time step, and K is a constant. Ralph Hartley's 1928 paper, Transmission of Information, uses the word information as a measurable quantity, reflecting the receiver's ability to distinguish one sequence of symbols from any other, thus quantifying information as H = log S^n = n log S, where S is the number of possible symbols and n the number of symbols in a transmission. The natural unit of information was therefore the decimal digit, much later renamed the hartley in his honour as a unit, scale, or measure of information.
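As a worked example of Hartley's measure, a message of n = 4 decimal symbols (S = 10) carries H = 4 log10(10) = 4 hartleys, or about 13.29 bits. A small Python sketch (the helper name is illustrative):

```python
import math

def hartley_information(n_symbols, alphabet_size):
    """Hartley's H = n * log10(S), measured in hartleys (decimal digits)."""
    return n_symbols * math.log10(alphabet_size)

h = hartley_information(4, 10)  # a 4-symbol decimal message
print(h, h * math.log2(10))     # 4.0 hartleys ~ 13.29 bits
```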

Alan Turing in 1940 used similar ideas as part of the statistical analysis of the breaking of the German Second World War Enigma ciphers.

Much of the mathematics behind information theory with events of different probabilities was developed for the field of thermodynamics by Ludwig Boltzmann and J. Willard Gibbs.

Connections between information-theoretic entropy and thermodynamic entropy, including the important contributions by Rolf Landauer in the 1960s, are explored in Entropy in thermodynamics and information theory.

In Shannon's revolutionary and groundbreaking paper, the work for which had been substantially completed at Bell Labs by the end of 1944, Shannon for the first time introduced the qualitative and quantitative model of communication as a statistical process underlying information theory, opening with the assertion that

"The fundamental problem of communication is that of reproducing at one point, either exactly or approximately, a message selected at another point."

With it came the ideas of

the information entropy and redundancy of a source, and its relevance through the source coding theorem;

the mutual information, and the channel capacity of a noisy channel, including the promise of perfect loss-free communication given by the noisy-channel coding theorem;

the practical result of the Shannon–Hartley law for the channel capacity of a Gaussian channel; as well as

the bit—a new way of seeing the most fundamental unit of information.

Quantities of information

Main article: Quantities of information

Information theory is based on probability theory and statistics. The most important quantities of information are entropy, the information in a random variable, and mutual information, the amount of information in common between two random variables.

The former quantity indicates how easily message data can be compressed while the latter can be used to find the communication rate across a channel.

The choice of logarithmic base in the following formulae determines the unit of information entropy that is used.

The most common unit of information is the bit, based on the binary logarithm. Other units include the nat, which is based on the natural logarithm, and the hartley, which is based on the common logarithm.

In what follows, an expression of the form p log p is considered by convention to be equal to zero whenever p = 0. This is justified because lim_{p→0+} p log p = 0 for any logarithmic base.

Entropy

Entropy of a Bernoulli trial as a function of success probability, often called the binary entropy function, H_b(p). The entropy is maximized at 1 bit per trial when the two possible outcomes are equally probable, as in an unbiased coin toss.

The entropy, H, of a discrete random variable X is a measure of the amount of uncertainty associated with the value of X.

Suppose one transmits 1000 bits (0s and 1s). If these bits are known ahead of transmission (to be a certain value with absolute probability), logic dictates that no information has been transmitted. If, however, each is equally and independently likely to be 0 or 1, 1000 bits (in the information theoretic sense) have been transmitted.

Between these two extremes, information can be quantified as follows. If {x1, ..., xn} is the set of all messages that X could be, and p(x) is the probability of message x, then the entropy of X is defined:

H(X) = E_X[I(x)] = − Σ_x p(x) log p(x)

(Here, I(x) = − log p(x) is the self-information, which is the entropy contribution of an individual message, and E_X is the expected value.)

An important property of entropy is that it is maximized when all the messages in the message space are equiprobable, p(x) = 1/n—i.e., most unpredictable—in which case H(X) = log n.

The special case of information entropy for a random variable with two outcomes is the binary entropy function, usually taken to the logarithmic base 2:

H_b(p) = − p log_2 p − (1 − p) log_2 (1 − p)
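A minimal Python version of this function, with the 0 log 0 = 0 convention handled explicitly:

```python
import math

def binary_entropy(p):
    """H_b(p) = -p*log2(p) - (1-p)*log2(1-p), in bits; 0*log(0) taken as 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(binary_entropy(0.5))  # 1.0 bit, the maximum
print(binary_entropy(0.1))  # ~0.469 bits
```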

Joint entropy

The joint entropy of two discrete random variables X and Y is merely the entropy of their pairing: (X, Y). This implies that if X and Y are independent, then their joint entropy is the sum of their individual entropies.

For example, if (X,Y) represents the position of a chess piece — X the row and Y the column, then the joint entropy of the row of the piece and the column of the piece will be the entropy of the position of the piece.

Despite similar notation, joint entropy should not be confused with cross entropy.

Conditional entropy (equivocation)

The conditional entropy or conditional uncertainty of X given random variable Y (also called the equivocation of X about Y) is the average conditional entropy over Y:

H(X|Y) = E_Y[H(X|y)] = − Σ_{x,y} p(x, y) log p(x|y)

Because entropy can be conditioned on a random variable or on that random variable being a certain value, care should be taken not to confuse these two definitions of conditional entropy, the former of which is in more common use. A basic property of this form of conditional entropy is that:

H(X|Y) = H(X, Y) − H(Y)
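This chain-rule property is easy to check numerically. A sketch using a small, made-up joint distribution over two binary variables:

```python
import math

def H(probs):
    """Shannon entropy in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A hypothetical joint distribution p(x, y); X and Y are correlated.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

H_XY = H(joint.values())
p_y = {y: sum(p for (x, yy), p in joint.items() if yy == y) for y in (0, 1)}
H_Y = H(p_y.values())

print(H_XY - H_Y)  # H(X|Y) = H(X,Y) - H(Y) ~ 0.722 bits
```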

Mutual information (transinformation)

Mutual information measures the amount of information that can be obtained about one random variable by observing another. It is important in communication, where it can be used to maximize the amount of information shared between sent and received signals.

The mutual information of X relative to Y is given by:

I(X; Y) = E_{X,Y}[SI(x, y)] = Σ_{x,y} p(x, y) log [ p(x, y) / (p(x) p(y)) ]

where SI (Specific mutual Information) is the pointwise mutual information.

A basic property of the mutual information is that:

I(X; Y) = H(X) − H(X|Y)

That is, knowing Y, we can save an average of I(X; Y) bits in encoding X compared to not knowing Y.

Mutual information is symmetric:

I(X; Y) = I(Y; X) = H(X) + H(Y) − H(X, Y)

Mutual information can be expressed as the average Kullback–Leibler divergence (information gain) of the posterior probability distribution of X given the value of Y to the prior distribution on X:

I(X; Y) = E_Y[ D_KL( p(X|y) ‖ p(X) ) ]

In other words, this is a measure of how much, on the average, the probability distribution on X will change if we are given the value of Y. This is often recalculated as the divergence from the product of the marginal distributions to the actual joint distribution:

I(X; Y) = D_KL( p(X, Y) ‖ p(X) p(Y) )

Mutual information is closely related to the log-likelihood ratio test in the context of contingency tables and the multinomial distribution and to Pearson's χ² test: mutual information can be considered a statistic for assessing independence between a pair of variables, and has a well-specified asymptotic distribution.
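A sketch computing the mutual information of a small joint distribution, both from the defining sum and from the entropy identity (the numbers here are made up for illustration):

```python
import math

def H(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
p_x = {x: sum(p for (xx, y), p in joint.items() if xx == x) for x in (0, 1)}
p_y = {y: sum(p for (x, yy), p in joint.items() if yy == y) for y in (0, 1)}

# I(X;Y) = sum over (x,y) of p(x,y) * log2( p(x,y) / (p(x) p(y)) )
I = sum(p * math.log2(p / (p_x[x] * p_y[y]))
        for (x, y), p in joint.items() if p > 0)

# Equivalently, I(X;Y) = H(X) + H(Y) - H(X,Y)
print(I, H(p_x.values()) + H(p_y.values()) - H(joint.values()))
```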

Kullback–Leibler divergence (information gain)

The Kullback–Leibler divergence (or information divergence, information gain, or relative entropy) is a way of comparing two distributions: a "true" probability distribution p(X), and an arbitrary probability distribution q(X).

If we compress data in a manner that assumes q(X) is the distribution underlying some data, when, in reality, p(X) is the correct distribution, the Kullback–Leibler divergence is the number of average additional bits per datum necessary for compression. It is thus defined:

D_KL( p(X) ‖ q(X) ) = Σ_x p(x) log [ p(x) / q(x) ]

Although it is sometimes used as a 'distance metric', it is not a true metric since it is not symmetric and does not satisfy the triangle inequality (making it a semi-quasimetric).
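A direct Python sketch of the definition; it assumes q(x) > 0 wherever p(x) > 0, and printing both orders shows the asymmetry:

```python
import math

def kl_divergence(p, q):
    """D_KL(p || q) in bits; assumes q[i] > 0 wherever p[i] > 0."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.5]  # the "true" distribution
q = [0.9, 0.1]  # the assumed distribution
print(kl_divergence(p, q))  # ~0.737 extra bits per datum
print(kl_divergence(q, p))  # ~0.531: not symmetric
```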

Other quantities

Other important information theoretic quantities include Rényi entropy (a generalization of entropy), differential entropy (a generalization of quantities of information to continuous distributions), and the conditional mutual information.

Coding theory

Main article: Coding theory

A picture showing scratches on the readable surface of a CD-R. Music and data CDs are coded using error-correcting codes and thus can still be read even if they have minor scratches, using error detection and correction.

Coding theory is one of the most important and direct applications of information theory. It can be subdivided into source coding theory and channel coding theory. Using a statistical description for data, information theory quantifies the number of bits needed to describe the data, which is the information entropy of the source.

Data compression (source coding): There are two formulations for the compression problem:

lossless data compression: the data must be reconstructed exactly;

lossy data compression: allocates bits needed to reconstruct the data, within a specified fidelity level measured by a distortion function. This subset of information theory is called rate–distortion theory.

Error-correcting codes (channel coding): While data compression removes as much redundancy as possible, an error correcting code adds just the right kind of redundancy (i.e., error correction) needed to transmit the data efficiently and faithfully across a noisy channel.
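The simplest possible illustration of channel coding is a repetition code: repeat each bit three times and decode by majority vote. This is far from the Shannon limit, but it shows added redundancy doing its job; the sketch below simulates a binary symmetric channel:

```python
import random

def encode(bits, r=3):
    """Repetition code: repeat each bit r times (adds redundancy)."""
    return [b for bit in bits for b in [bit] * r]

def bsc(bits, p):
    """Binary symmetric channel: flip each bit with probability p."""
    return [b ^ (random.random() < p) for b in bits]

def decode(received, r=3):
    """Majority vote over each block of r received bits."""
    return [int(sum(received[i:i + r]) > r // 2)
            for i in range(0, len(received), r)]

msg = [1, 0, 1, 1, 0, 0, 1, 0]
noisy = bsc(encode(msg), p=0.1)
print(decode(noisy))  # usually recovers msg despite the noise
```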

This division of coding theory into compression and transmission is justified by the information transmission theorems, or source–channel separation theorems, that justify the use of bits as the universal currency for information in many contexts. However, these theorems only hold in the situation where one transmitting user wishes to communicate to one receiving user.

In scenarios with more than one transmitter (the multiple-access channel), more than one receiver (the broadcast channel) or intermediary "helpers" (the relay channel), or more general networks, compression followed by transmission may no longer be optimal. Network information theory refers to these multi-agent communication models.

Source theory

Any process that generates successive messages can be considered a source of information. A memoryless source is one in which each message is an independent identically-distributed random variable, whereas the properties of ergodicity and stationarity impose more general constraints. All such sources are stochastic. These terms are well studied in their own right outside information theory.

Rate

Information rate is the average entropy per symbol. For memoryless sources, this is merely the entropy of each symbol, while, in the case of a stationary stochastic process, it is

r = lim_{n→∞} H(X_n | X_{n−1}, X_{n−2}, ..., X_1);

that is, the conditional entropy of a symbol given all the previous symbols generated. For the more general case of a process that is not necessarily stationary, the average rate is

r = lim_{n→∞} (1/n) H(X_1, X_2, ..., X_n);

that is, the limit of the joint entropy per symbol. For stationary sources, these two expressions give the same result.

It is common in information theory to speak of the "rate" or "entropy" of a language.

This is appropriate, for example, when the source of information is English prose. The rate of a source of information is related to its redundancy and how well it can be compressed, the subject of source coding.
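A crude way to see this is to estimate the per-symbol entropy of a text from single-character frequencies; because it ignores dependence on previous symbols, this zeroth-order figure is only an upper bound on the true rate (the sample string and the 27-symbol reference alphabet of letters plus space are illustrative assumptions):

```python
import math
from collections import Counter

def zeroth_order_entropy(text):
    """Per-symbol entropy estimated from single-character frequencies, in bits."""
    counts = Counter(text)
    n = len(text)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

sample = "it is common in information theory to speak of the rate of a language"
h = zeroth_order_entropy(sample)
print(h)                      # bits per character from first-order statistics
print(1 - h / math.log2(27))  # crude redundancy vs. a uniform 27-symbol alphabet
```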

Channel capacity

Main article: Channel capacity

Communication over a channel—such as an ethernet cable—is the primary motivation of information theory.

As anyone who's ever used a telephone (mobile or landline) knows, however, such channels often fail to produce exact reconstruction of a signal; noise, periods of silence, and other forms of signal corruption often degrade quality. How much information can one hope to communicate over a noisy (or otherwise imperfect) channel?

Consider the communications process over a discrete channel. A simple model of the process is as follows:

Here X represents the space of messages transmitted, and Y the space of messages received during a unit time over our channel. Let p(y|x) be the conditional probability distribution function of Y given X.

We will consider p(y|x) to be an inherent fixed property of our communications channel (representing the nature of the noise of our channel). Then the joint distribution of X and Y is completely determined by our channel and by our choice of f(x), the marginal distribution of messages we choose to send over the channel.

Under these constraints, we would like to maximize the rate of information, or the signal, we can communicate over the channel. The appropriate measure for this is the mutual information, and this maximum mutual information is called the channel capacity and is given by:

C = max_f I(X; Y)

where the maximum is taken over all possible choices of f(x).

This capacity has the following property related to communicating at information rate R (where R is usually bits per symbol). For any information rate R < C and coding error ε > 0, for large enough N, there exists a code of length N and rate ≥ R and a decoding algorithm, such that the maximal probability of block error is ≤ ε; that is, it is always possible to transmit with arbitrarily small block error. In addition, for any rate R > C, it is impossible to transmit with arbitrarily small block error.

Channel coding is concerned with finding such nearly optimal codes that can be used to transmit data over a noisy channel with a small coding error at a rate near the channel capacity.

Capacity of particular channel models


A continuous-time analog communications channel subject to Gaussian noise — see Shannon–Hartley theorem.

A binary symmetric channel (BSC) with crossover probability p is a binary input, binary output channel that flips the input bit with probability p. The BSC has a capacity of 1 − H_b(p) bits per channel use, where H_b is the binary entropy function to the base 2 logarithm.

A binary erasure channel (BEC) with erasure probability p is a binary input, ternary output channel. The possible channel outputs are 0, 1, and a third symbol 'e' called an erasure. The erasure represents complete loss of information about an input bit. The capacity of the BEC is 1 − p bits per channel use.
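Both closed-form capacities are straightforward to evaluate; a minimal sketch:

```python
import math

def binary_entropy(p):
    """H_b(p) in bits, with 0*log(0) taken as 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel: 1 - H_b(p) bits per use."""
    return 1 - binary_entropy(p)

def bec_capacity(p):
    """Capacity of a binary erasure channel: 1 - p bits per use."""
    return 1 - p

print(bsc_capacity(0.11))  # ~0.5 bits per channel use
print(bec_capacity(0.25))  # 0.75 bits per channel use
```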

Applications to other fields

Intelligence uses and secrecy applications

Information theoretic concepts apply to cryptography and cryptanalysis. Turing's information unit, the ban, was used in the Ultra project, breaking the German Enigma machine code and hastening the end of WWII in Europe.

Shannon himself defined an important concept now called the unicity distance. Based on the redundancy of the plaintext, it attempts to give a minimum amount of ciphertext necessary to ensure unique decipherability.

Information theory leads us to believe it is much more difficult to keep secrets than it might first appear.

A brute force attack can break systems based on asymmetric key algorithms or on most commonly used methods of symmetric key algorithms (sometimes called secret key algorithms), such as block ciphers. The security of all such methods currently comes from the assumption that no known attack can break them in a practical amount of time.

Information theoretic security refers to methods such as the one-time pad that are not vulnerable to such brute force attacks. In such cases, the positive conditional mutual information between the plaintext and ciphertext (conditioned on the key) can ensure proper transmission, while the unconditional mutual information between the plaintext and ciphertext remains zero, resulting in absolutely secure communications. In other words, an eavesdropper would not be able to improve his or her guess of the plaintext by gaining knowledge of the ciphertext but not of the key.

However, as in any other cryptographic system, care must be used to correctly apply even information-theoretically secure methods; the Venona project was able to crack the one-time pads of the Soviet Union due to their improper reuse of key material.
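A minimal sketch of the one-time pad itself, using XOR over bytes; its information-theoretic security holds only if the key is truly random, at least as long as the message, and never reused (the failure exploited by Venona):

```python
import secrets

def xor_bytes(a, b):
    """XOR two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

plaintext = b"ATTACK AT DAWN"
key = secrets.token_bytes(len(plaintext))  # as long as the message, used once

ciphertext = xor_bytes(plaintext, key)
print(xor_bytes(ciphertext, key))  # b'ATTACK AT DAWN'
```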

Pseudorandom number generation

Pseudorandom number generators are widely available in computer language libraries and application programs. They are, almost universally, unsuited to cryptographic use as they do not evade the deterministic nature of modern computer equipment and software.

A class of improved random number generators is termed cryptographically secure pseudorandom number generators, but even they require random seeds external to the software to work as intended. These can be obtained via extractors, if done carefully. The measure of sufficient randomness in extractors is min-entropy, a value related to Shannon entropy through Rényi entropy; Rényi entropy is also used in evaluating randomness in cryptographic systems. Although related, the distinctions among these measures mean that a random variable with high Shannon entropy is not necessarily satisfactory for use in an extractor and so for cryptographic uses.
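The distinction is easy to see numerically: min-entropy is determined by the single most likely outcome, H_min = − log_2 max p(x), so a skewed distribution can score reasonably on Shannon entropy yet poorly on min-entropy. A sketch with made-up numbers:

```python
import math

def shannon_entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

def min_entropy(probs):
    """H_min = -log2(max p): worst-case guessing difficulty, in bits."""
    return -math.log2(max(probs))

# One outcome takes half the mass; 15 others share the rest.
probs = [0.5] + [0.5 / 15] * 15
print(shannon_entropy(probs))  # ~2.95 bits
print(min_entropy(probs))      # 1.0 bit
```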

Seismic exploration

One early commercial application of information theory was in the field of seismic oil exploration. Work in this field made it possible to strip off and separate the unwanted noise from the desired seismic signal. Information theory and digital signal processing offer a major improvement of resolution and image clarity over previous analog methods.

Miscellaneous applications

Information theory also has applications in gambling and investing, black holes, bioinformatics, and music.

Informatics is the science of information, the practice of information processing, and the engineering of information systems. Informatics studies the structure, algorithms, behavior, and interactions of natural and artificial systems that store, process, access and communicate information.

It also develops its own conceptual and theoretical foundations and utilizes foundations developed in other fields. Since the advent of computers, individuals and organizations increasingly process information digitally. This has led to the study of informatics that has computational, cognitive and social aspects, including study of the social impact of information technologies.

Etymology

In 1957 the German computer scientist Karl Steinbuch coined the word Informatik by publishing a paper called Informatik: Automatische Informationsverarbeitung ("Informatics: Automatic Information Processing").

The English term Informatics is sometimes understood as meaning the same as computer science. However, the German word Informatik is the correct translation of English computer science.

The French term informatique was coined in 1962 by Philippe Dreyfus, together with various translations—informatics (English), also proposed independently and simultaneously by Walter F. Bauer and associates who co-founded Informatics Inc., and informatica (Italian, Spanish, Romanian, Portuguese, Dutch)—referring to the application of computers to store and process information.

The term was coined as a combination of "information" and "automatic" to describe the science of automating information interactions.

The morphology—informat-ion + -ics—uses "the accepted form for names of sciences, as conics, linguistics, optics, or matters of practice, as economics, politics, tactics", and so, linguistically, the meaning extends easily to encompass both the science of information and the practice of information processing.

The naming for computer science is derived from the concept of computation, which may or may not involve the existence of information. For example, quantum computation and digital logic do not involve information.

History

This new term was adopted across Western Europe, and, except in English, developed a meaning roughly translated by the English 'computer science', or 'computing science'.

Mikhailov et al. advocated the Russian term informatika (1966), and the English informatics (1967), as names for the theory of scientific information, and argued for a broader meaning, including study of the use of information technology in various communities (for example, scientific) and of the interaction of technology and human organizational structures.

Informatics is the discipline of science which investigates the structure and properties (not specific content) of scientific information, as well as the regularities of scientific information activity, its theory, history, methodology and organization.

Usage has since modified this definition in three ways.

First, the restriction to scientific information is removed, as in business informatics or legal informatics. Second, since most information is now digitally stored, computation is now central to informatics. Third, the representation, processing and communication of information are added as objects of investigation, since they have been recognized as fundamental to any scientific account of information. Taking information as the central focus of study, then, distinguishes informatics, which includes study of biological and social mechanisms of information processing, from computer science, where digital computation plays a distinguished central role.

Similarly, in the study of representation and communication, informatics is indifferent to the substrate that carries information. For example, it encompasses the study of communication using gesture, speech and language, as well as digital communications and networking.

The first example of a degree-level qualification in informatics occurred in 1982, when Plymouth Polytechnic (now the University of Plymouth) offered a four-year BSc (Honours) degree in Computing and Informatics, with an initial intake of only 35 students. The course still runs today, making it the longest available qualification in the subject.

A broad interpretation of informatics, as "the study of the structure, algorithms, behaviour, and interactions of natural and artificial computational systems," was introduced by the University of Edinburgh in 1994 when it formed the grouping that is now its School of Informatics. This meaning is now (2006) increasingly used in the United Kingdom.

Informatics encompasses the study of systems that represent, process, and communicate information. However, the theory of computation in the specific discipline of theoretical computer science, which evolved from the work of Alan Turing, studies the notion of a complex system regardless of whether information actually exists.

Since both fields process information, there is some disagreement among scientists as to field hierarchy; for example, Arizona State University attempted to adopt a broader definition of informatics to even encompass cognitive science at the launch of its School of Computing and Informatics in September 2006.

The confusion arises since information can be easily stored on a computer, and hence informatics could be considered the parent of computer science. However, the original notion of a computer was the name given to the action of computation, regardless of the existence of information or the existence of a von Neumann architecture.

Humans are examples of computational systems and not information systems. Many fields, such as quantum computing theory, are studied in theoretical computer science but are not related to informatics.

The 2008 Research Assessment Exercise, of the UK Funding Councils, includes a new unit of assessment (UoA), Computer Science and Informatics, whose scope is described as follows:

The UoA includes the study of methods for acquiring, storing, processing, communicating and reasoning about information, and the role of interactivity in natural and artificial systems, through the implementation, organisation and use of computer hardware, software and other resources.

The subjects are characterised by the rigorous application of analysis, experimentation and design.

At the Indiana University School of Informatics (Bloomington, Indianapolis and Southeast), informatics is defined as "the art, science and human dimensions of information technology" and "the study, application, and social consequences of technology." It is also defined in Informatics I101, Introduction to Informatics, as "the application of information technology to the arts, sciences, and professions." These definitions are widely accepted in the United States, and differ from British usage in omitting the study of natural computation.

At the University of California, Irvine Department of Informatics, informatics is defined as "the interdisciplinary study of the design, application, use and impact of information technology. The discipline of informatics is based on the recognition that the design of this technology is not solely a technical matter, but must focus on the relationship between the technology and its use in real-world settings. That is, informatics designs solutions in context, and takes into account the social, cultural and organizational settings in which computing and information technology will be used."

At the University of Michigan, Ann Arbor Informatics interdisciplinary major, informatics is defined as "the study of information and the ways information is used by and affects human beings and social systems. Key to this growing field is that it applies both technological and social perspectives to the study of information. Michigan's interdisciplinary approach to teaching Informatics gives you a solid grounding in contemporary computer programming, mathematics, and statistics, combined with study of the ethical and social science aspects of complex information systems. Experts in the field help design new information technology tools for specific scientific, business, and cultural needs."

In the English-speaking world the term informatics was first widely used in the compound 'medical informatics', taken to include "the cognitive, information processing, and communication tasks of medical practice, education, and research, including information science and the technology to support these tasks".

Many such compounds are now in use; they can be viewed as different areas of applied informatics.

One of the most significant areas of applied informatics is that of organisational informatics.

Organisational informatics is fundamentally interested in the application of information, information systems and ICT within organisations of various forms, including private sector, public sector and voluntary sector organisations. As such, organisational informatics can be seen as a sub-category of social informatics and a super-category of business informatics.

A practitioner of informatics may be called an informatician or an informaticist.

In 1989, the first International Olympiad in Informatics (IOI) was held in Bulgaria.

The olympiad involves two days of intense competition for five hours each day. Four students are selected from each participating country to attend and compete for Gold, Silver and Bronze medals. The 2008 IOI was held in Cairo, Egypt.
