568 posts in the 'Computer' category

  1. 2008.04.10 Craig interpolation by Unknown user
  2. 2008.04.10 Compactness theorem by Unknown user
  3. 2008.04.10 Finite group by Unknown user
  4. 2008.04.10 Field (mathematics) by Unknown user
  5. 2008.04.10 Ring theory by Unknown user
  6. 2008.04.10 Commutative ring by Unknown user
  7. 2008.04.10 Factorization by Unknown user
  8. 2008.04.10 Number theory by Unknown user
  9. 2008.04.10 Combinatorics by Unknown user
  10. 2008.04.10 PDP-11 by Unknown user

Craig interpolation

Computer/Terms 2008. 4. 10. 11:04

The exact definition of Craig interpolation depends on the type of logic being considered; the result was first proved for first-order logic by William Craig in 1957. For propositional logic it can be stated as follows:

If

X → Y

is a tautology, and Z is a formula such that every propositional variable of Z occurs in both X and Y, and both

X → Z

and

Z → Y

are tautologies, then Z is called an interpolant for

X → Y.

A simple example is that of P as an interpolant for

P ∧ R → P ∨ Q.

In propositional logic, the Craig interpolation lemma says that whenever an implication

X → Y

is a tautology, there exists an interpolant.
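
As a concrete check of the example above, here is a small Python sketch (the helper names are my own, not part of the post) that verifies P is an interpolant for P ∧ R → P ∨ Q by enumerating all truth assignments: its only variable, P, occurs on both sides, and both implications X → Z and Z → Y are tautologies.

    from itertools import product

    def implies(a, b):
        # Material implication: a -> b is False only when a is True and b is False.
        return (not a) or b

    def is_tautology(formula, variables):
        # True if the formula evaluates to True under every truth assignment.
        return all(formula(dict(zip(variables, values)))
                   for values in product([False, True], repeat=len(variables)))

    X = lambda v: v['P'] and v['R']   # antecedent  X = P ∧ R
    Y = lambda v: v['P'] or v['Q']    # consequent  Y = P ∨ Q
    Z = lambda v: v['P']              # candidate interpolant Z = P

    variables = ['P', 'Q', 'R']
    print(is_tautology(lambda v: implies(X(v), Y(v)), variables))  # True: X → Y
    print(is_tautology(lambda v: implies(X(v), Z(v)), variables))  # True: X → Z
    print(is_tautology(lambda v: implies(Z(v), Y(v)), variables))  # True: Z → Y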

Reference:
http://en.wikipedia.org/wiki/Craig_interpolation_lemma

Posted by Unknown user

Compactness theorem

Computer/Terms 2008. 4. 10. 10:52

In mathematical logic, the compactness theorem states that a (possibly infinite) set of first-order sentences has a model if and only if every finite subset of it has a model. There is a generalization of compactness for languages of higher order than first-order ones. For theories based on logics that are strictly stronger than first-order logic, however, compactness is seen to be too strong a property.

The compactness theorem for the propositional calculus is a result of Tychonoff's theorem (which says that the product of compact spaces is compact) applied to compact Stone spaces; hence the theorem's name.
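
Stated formally, in standard model-theoretic notation that is not part of the original post, the theorem reads:

    % Standard statement of the compactness theorem (added here as a sketch).
    % \Sigma is a set of first-order sentences; \Sigma_0 ranges over its finite subsets.
    \[
      \Sigma \text{ has a model}
      \quad\Longleftrightarrow\quad
      \text{every finite } \Sigma_0 \subseteq \Sigma \text{ has a model.}
    \]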

Reference:
http://en.wikipedia.org/wiki/Compactness_theorem

Posted by Unknown user

Finite group

Computer/Terms 2008. 4. 10. 10:48

In mathematics, a finite group is a group which has finitely many elements. Some aspects of the theory of finite groups were investigated in great depth in the twentieth century, in particular the local theory, and the theory of solvable groups and nilpotent groups. It is too much to hope for a complete determination of the structure of all finite groups: the number of possible structures soon becomes overwhelming. However, the twentieth century saw the classification of the finite simple groups, which may be viewed as the determination of the "building blocks" for all finite groups, as each finite group has a composition series.

Thanks to the work of mathematicians such as Chevalley and Steinberg, the second half of the twentieth century also saw increased understanding of finite analogs of classical groups, and of other related groups. One such family of groups is the family of general linear groups over finite fields. The group theorist J. L. Alperin has written that "The typical example of a finite group is GL(n,q), the general linear group of n dimensions over the field with q elements. The student who is introduced to the subject with other examples is being completely misled."
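
To make Alperin's example concrete, here is a short Python sketch (not part of the original post; the function name is my own) that computes the order of GL(n, q) by counting ordered bases of an n-dimensional vector space over the field with q elements, showing how quickly these finite groups grow.

    def order_gl(n, q):
        # Number of invertible n x n matrices over the finite field with q elements:
        # each row must lie outside the span of the previous rows, giving
        # |GL(n, q)| = (q^n - 1)(q^n - q) ... (q^n - q^(n-1)).
        order = 1
        for k in range(n):
            order *= q**n - q**k
        return order

    print(order_gl(2, 2))   # 6   (GL(2, 2) is isomorphic to the symmetric group S_3)
    print(order_gl(2, 3))   # 48
    print(order_gl(4, 5))   # 116064000000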

Finite groups often occur when considering symmetry of mathematical or physical objects, when those objects admit just a finite number of structure preserving transformations. The theory of Lie groups, which may be viewed as dealing with "continuous symmetry", is strongly influenced by the associated Weyl groups. These are finite groups generated by reflections which act on a finite dimensional Euclidean space. Thus properties of finite groups can play a role in subjects such as theoretical physics.

Reference:
http://en.wikipedia.org/wiki/Finite_group

Posted by Unknown user

Field (mathematics)

Computer/Terms 2008. 4. 10. 10:40

In abstract algebra, a field is an algebraic structure in which the operations of addition, subtraction, multiplication and division (except division by zero) may be performed, and the same rules hold which are familiar from the arithmetic of ordinary numbers.

All fields are rings, but not conversely. Fields differ from rings most importantly in the requirement that division be possible, but also, in modern definitions, in the requirement that multiplication in a field be commutative. If multiplication is not required to be commutative, the structure is a so-called skew field, better known as a division ring; historically, division rings were sometimes called fields, while fields in the modern sense were called commutative fields.

The prototypical example of a field is Q, the field of rational numbers. Other important examples include the field of real numbers R, the field of complex numbers C and, for any prime number p, the finite field of integers modulo p, denoted Z/pZ, F_p or GF(p). For any field K, the set K(X) of rational functions with coefficients in K is also a field.
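
To make the "division except by zero" requirement concrete, here is a small Python sketch (my own illustration, not from the post) of arithmetic in the finite field F_p for a prime p; the modular inverse is computed with pow(b, -1, p), available since Python 3.8.

    p = 7  # any prime number

    a, b = 3, 5
    print((a + b) % p)      # addition:        1
    print((a - b) % p)      # subtraction:     5
    print((a * b) % p)      # multiplication:  1
    inv_b = pow(b, -1, p)   # multiplicative inverse of b modulo p
    print((a * inv_b) % p)  # division a / b:  2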

The mathematical discipline concerned with the study of fields is called field theory.

Reference:
http://en.wikipedia.org/wiki/Field_%28mathematics%29

Posted by Unknown user

Ring theory

Computer/Terms 2008. 4. 10. 10:35

In mathematics, ring theory is the study of rings, algebraic structures in which addition and multiplication are defined and have similar properties to those familiar from the integers. Ring theory studies the structure of rings, their representations, or, in different language, modules, special classes of rings (group rings, division rings, universal enveloping algebras), as well as an array of properties that proved to be of interest both within the theory itself and for its applications, such as homological properties and polynomial identities.

Commutative rings are much better understood than noncommutative ones. Because of their intimate connections with algebraic geometry and algebraic number theory, which provide many natural examples of commutative rings, their theory (considered part of commutative algebra and field theory rather than of general ring theory) is quite different in flavour from the theory of their noncommutative counterparts. A fairly recent trend, started in the 1980s with the development of noncommutative geometry and with the discovery of quantum groups, attempts to turn the situation around and build the theory of certain classes of noncommutative rings in a geometric fashion, as if they were rings of functions on (non-existent) 'noncommutative spaces'.

Please refer to the glossary of ring theory for the definitions of terms used throughout ring theory.

Reference:
http://en.wikipedia.org/wiki/Ring_theory

Posted by Unknown user

Commutative ring

Computer/Terms 2008. 4. 10. 10:27

In ring theory, a branch of abstract algebra, a commutative ring is a ring in which the multiplication operation obeys the commutative law. This means that if a and b are any elements of the ring, then a×b=b×a.
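
A brief Python sketch (my own illustration; the matmul helper is not from the post) contrasts a commutative ring with a noncommutative one: multiplication of integers commutes, while multiplication of 2 × 2 integer matrices does not.

    a, b = 6, 7
    print(a * b == b * a)   # True: the integers form a commutative ring

    def matmul(m, n):
        # Multiply two 2 x 2 matrices given as nested lists.
        return [[sum(m[i][k] * n[k][j] for k in range(2)) for j in range(2)]
                for i in range(2)]

    A = [[1, 1], [0, 1]]
    B = [[1, 0], [1, 1]]
    print(matmul(A, B) == matmul(B, A))   # False: 2 x 2 matrices form a noncommutative ring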

The study of commutative rings is called commutative algebra.

Reference:
http://en.wikipedia.org/wiki/Commutative_ring

Posted by Unknown user

Factorization

Computer/Terms 2008. 4. 10. 10:19

In mathematics, factorization (also factorisation in British English) or factoring is the decomposition of an object (for example, a number, a polynomial, or a matrix) into a product of other objects, or factors, which when multiplied together give the original. For example, the number 15 factors into primes as 3 × 5, and the polynomial x^2 − 4 factors as (x − 2)(x + 2). In all cases, a product of simpler objects is obtained.
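
For the integer case, a minimal Python sketch (the function name is my own) recovers factorizations such as 15 = 3 × 5 by trial division:

    def prime_factors(n):
        # Return the prime factorization of n as a list of primes with multiplicity.
        factors = []
        d = 2
        while d * d <= n:
            while n % d == 0:
                factors.append(d)
                n //= d
            d += 1
        if n > 1:
            factors.append(n)
        return factors

    print(prime_factors(15))    # [3, 5]
    print(prime_factors(360))   # [2, 2, 2, 3, 3, 5]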

The aim of factoring is usually to reduce something to "basic building blocks," such as numbers to prime numbers, or polynomials to irreducible polynomials. Factoring integers is covered by the fundamental theorem of arithmetic and factoring polynomials by the fundamental theorem of algebra.

The opposite of factorization is expansion. This is the process of multiplying together factors to recreate the original, "expanded" polynomial.

Integer factorization for large integers appears to be a difficult problem. There is no known method to carry it out quickly. Its complexity is the basis of the assumed security of some public key cryptography algorithms, such as RSA.

A matrix can also be factorized into a product of matrices of special types, for an application in which that form is convenient. One major example of this uses an orthogonal or unitary matrix, and a triangular matrix. There are different types: QR decomposition, LQ, QL, RQ, RZ.
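
As an illustration of one such matrix factorization, the following sketch uses NumPy (not mentioned in the original post) to compute a QR decomposition and check that the orthogonal factor Q and the upper-triangular factor R multiply back to the original matrix.

    import numpy as np

    A = np.array([[12.0, -51.0,   4.0],
                  [ 6.0, 167.0, -68.0],
                  [-4.0,  24.0, -41.0]])

    Q, R = np.linalg.qr(A)
    print(np.allclose(Q @ R, A))             # True: the factors reproduce A
    print(np.allclose(Q.T @ Q, np.eye(3)))   # True: Q is orthogonal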

Another example is the factorization of a function as the composition of other functions having certain properties; for example, every function can be viewed as the composition of a surjective function with an injective function.

Reference:
http://en.wikipedia.org/wiki/Factorization

Posted by Unknown user

Number theory

Computer/Terms 2008. 4. 10. 10:12

Number theory is the branch of pure mathematics concerned with the properties of numbers in general, and integers in particular, as well as the wider classes of problems that arise from their study.

Number theory may be subdivided into several fields, according to the methods used and the type of questions investigated. (See the list of number theory topics.)

The term "arithmetic" is also used to refer to number theory. This is a somewhat older term, which is no longer as popular as it once was. Number theory used to be called the higher arithmetic, but this too is dropping out of use. Nevertheless, it still shows up in the names of mathematical fields (arithmetic functions, arithmetic of elliptic curves, fundamental theorem of arithmetic). This sense of the term arithmetic should not be confused either with elementary arithmetic, or with the branch of logic which studies Peano arithmetic as a formal system. Mathematicians working in the field of number theory are called number theorists.

Reference:
http://en.wikipedia.org/wiki/Number_theory

Posted by Unknown user

Combinatorics

Computer/Terms 2008. 4. 10. 10:04

Combinatorics is a branch of pure mathematics concerning the study of discrete (and usually finite) objects. It is related to many other areas of mathematics, such as algebra, probability theory, ergodic theory and geometry, as well as to applied subjects in computer science and statistical physics. Aspects of combinatorics include "counting" the objects satisfying certain criteria (enumerative combinatorics), deciding when the criteria can be met, and constructing and analyzing objects meeting the criteria (as in combinatorial designs and matroid theory), finding "largest", "smallest", or "optimal" objects (extremal combinatorics and combinatorial optimization), and finding algebraic structures these objects may have (algebraic combinatorics).

Combinatorics is as much about problem solving as theory building, though it has developed powerful theoretical methods, especially since the latter half of the twentieth century (see the page List of combinatorics topics for details of the more recent development of the subject). One of the oldest and most accessible parts of combinatorics is graph theory, which also has numerous natural connections to other areas.

There are many combinatorial patterns and theorems related to the structure of combinatorial sets. These often focus on a partition or ordered partition of a set. See the List of partition topics for an expanded list of related topics or the List of combinatorics topics for a more general listing. Some of the more notable results are highlighted in the referenced article.

An example of a simple combinatorial question is the following: What is the number of possible orderings of a deck of 52 distinct playing cards? The answer is 52! (52 factorial), which is equal to about 8.0658 × 10^67.
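
The count quoted above is easy to check; a short Python sketch (my own, using only the standard library) computes 52! exactly and in scientific notation.

    import math

    orderings = math.factorial(52)   # exact number of orderings of a 52-card deck
    print(orderings)
    print(f"{orderings:.4e}")        # about 8.0658e+67, matching the figure above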

Another example of a more difficult problem: Given a certain number n of people, is it possible to assign them to sets so that each person is in at least one set, each pair of people is in exactly one set together, every two sets have exactly one person in common, and no set contains everyone, all but one person, or exactly one person? The answer depends on n; such questions are studied in the theory of combinatorial designs.

Combinatorics is used frequently in computer science to obtain estimates on the number of elements of certain sets. A mathematician who studies combinatorics is often referred to as a combinatorialist or combinatorist.

Reference:
http://en.wikipedia.org/wiki/Combinatorics

Posted by Unknown user

PDP-11

Computer/Terms 2008. 4. 10. 09:55

The PDP-11 was a series of 16-bit minicomputers sold by Digital Equipment Corp. in the 1970s and 1980s. It was the successor to DEC's PDP-8 in the PDP series of computers. It had several uniquely innovative features and was easier to program than its predecessors. It was well liked by programmers, and in the mid-range minicomputer niche it was eventually replaced by the VAX-11, a 32-bit extension of the PDP-11. Much of the market for both machines would later be taken by personal computers, including the IBM PC and Apple II, and by workstations, such as those from Sun Microsystems.

Reference:
http://en.wikipedia.org/wiki/Pdp-11

Posted by Unknown user