In mathematics, the continuum hypothesis (abbreviated CH) is a hypothesis, advanced by Georg Cantor, about the possible sizes of infinite sets. Cantor introduced the concept of cardinality to compare the sizes of infinite sets, and he gave two proofs that the cardinality of the set of integers is strictly smaller than that of the set of real numbers. His proofs, however, give no indication of the extent to which the cardinality of the natural numbers is less than that of the real numbers. Cantor proposed the continuum hypothesis as a possible solution to this question. It states:

There is no set whose size is strictly between that of the integers and that of the real numbers.

In light of Cantor's theorem that the sizes of these sets cannot be equal, this hypothesis states that the set of real numbers has minimal possible cardinality which is greater than the cardinality of the set of integers. The name of the hypothesis comes from the term the continuum for the real numbers.

Equivalently, as the cardinality of the integers is ℵ_0 ("aleph-null") and the cardinality of the real numbers is 2^ℵ_0, the continuum hypothesis says that there is no set S for which ℵ_0 < |S| < 2^ℵ_0. Assuming the axiom of choice, there is a smallest cardinal number ℵ_1 greater than ℵ_0, and the continuum hypothesis is in turn equivalent to the equality 2^ℵ_0 = ℵ_1.
 
There is also a generalization of the continuum hypothesis called the generalized continuum hypothesis (GCH) saying:

For all ordinals α, 2^ℵ_α = ℵ_(α+1).
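
The two hypotheses can be written compactly in standard notation:

```latex
\text{CH:}\quad \nexists\, S \;:\; \aleph_0 < |S| < 2^{\aleph_0}
\qquad (\text{equivalently, assuming AC: } 2^{\aleph_0} = \aleph_1)

\text{GCH:}\quad \forall \alpha \;:\; 2^{\aleph_\alpha} = \aleph_{\alpha+1}
```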

Reference:
http://en.wikipedia.org/wiki/Continuum_hypothesis

Posted by Unknown user

The calculus of constructions (CoC) is a higher-order typed lambda calculus, initially developed by Thierry Coquand, where types are first-class values. It is thus possible, within the CoC, to define functions from, say, integers to types, types to types as well as functions from integers to integers.
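
This can be illustrated with a small sketch in Lean 4, whose core calculus descends from the CoC (the names `NTuple` and `Pairify` are invented here for illustration):

```lean
-- Types are first-class: a function from natural numbers to types.
-- `NTuple n` is the type of n-tuples of natural numbers.
def NTuple : Nat → Type
  | 0     => Unit
  | n + 1 => Nat × NTuple n

-- A function from types to types.
def Pairify (α : Type) : Type := α × α

-- An inhabitant of `NTuple 2`, i.e. Nat × (Nat × Unit).
example : NTuple 2 := (1, 2, ())
```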

The CoC is strongly normalizing, though, by Gödel's incompleteness theorem, it is impossible to prove this property within the CoC since it implies consistency.

The CoC was the basis of the early versions of the Coq theorem prover; later versions were built upon the Calculus of inductive constructions, an extension of the CoC with native support for inductive datatypes. In the original CoC, inductive datatypes had to be emulated via their polymorphic destructor functions.

Reference:
http://en.wikipedia.org/wiki/Calculus_of_constructions

Posted by Unknown user

Axiomatic system

Computer/Terms 2008. 4. 10. 15:08

In mathematics, an axiomatic system is any set of axioms from which some or all of the axioms can be used in conjunction to logically derive theorems. A mathematical theory consists of an axiomatic system and all its derived theorems. An axiomatic system that is completely described is a special kind of formal system; usually, though, the effort towards complete formalisation brings diminishing returns in certainty and a lack of readability for humans. Discussion of axiomatic systems is therefore normally only semi-formal. A formal theory typically means an axiomatic system, for example one formulated within model theory. A formal proof is a complete rendition of a mathematical proof within a formal system.

Reference:
http://en.wikipedia.org/wiki/Axiomatic_system
Posted by Unknown user

Proof calculus

Computer/Terms 2008. 4. 10. 14:42

In mathematical logic, a proof calculus corresponds to a family of formal systems that use a common style of formal inference for its inference rules. The specific inference rules of a member of such a family characterize the theory of a logic.

Usually a given proof calculus encompasses more than a single particular formal system, since many proof calculi can be used for radically different logics. For example, a paradigmatic case such as the sequent calculus can be used to express the consequence relations of both intuitionistic logic and relevance logic. Thus, loosely speaking, a proof calculus is a template or design pattern, characterized by a certain style of formal inference, that may be specialized to produce specific formal systems, namely by specifying the actual inference rules for such a system. There is no consensus among logicians on how best to define the term.

Reference:
http://en.wikipedia.org/wiki/Proof_calculus

Posted by Unknown user

Well-behaved

Computer/Terms 2008. 4. 10. 13:19

Mathematicians (and those in related sciences) very frequently speak of whether a mathematical object — a number, a function, a set, a space of one sort or another — is "well-behaved" or not. While the term has no fixed formal definition, it can have fairly precise meaning within a given context.

In pure mathematics, "well-behaved" objects are those that can be proved or analyzed by elegant means to have elegant properties.

In both pure and applied mathematics (optimization, numerical integration, or mathematical physics, for example), well-behaved also means not violating any assumptions needed to successfully apply whatever analysis is being discussed.

The opposite case is usually labeled pathological. It is not unusual to have situations in which most cases (in terms of cardinality) are pathological, but the pathological cases will not arise in practice unless constructed deliberately. (Of course, in these matters of taste one person's "well-behaved" vs. "pathological" dichotomy is usually some other person's division into "trivial" vs. "interesting".)

Reference:
http://en.wikipedia.org/wiki/Well-behaved

Posted by Unknown user

Type theory

Computer/Terms 2008. 4. 10. 12:32

In mathematics, logic and computer science, type theory is any of several formal systems that can serve as alternatives to naive set theory, or the study of such formalisms in general. In programming language theory, a branch of computer science, type theory can refer to the design, analysis and study of type systems, although some computer scientists limit the term's meaning to the study of abstract formalisms such as typed λ-calculi.

Bertrand Russell invented the first type theory in response to his discovery that Gottlob Frege's version of naive set theory was afflicted with Russell's paradox. This type theory features prominently in Whitehead and Russell's Principia Mathematica. It avoids Russell's paradox by first creating a hierarchy of types, then assigning each mathematical (and possibly other) entity to a type. Objects of a given type are built exclusively from objects of preceding types (those lower in the hierarchy), thus preventing loops.

Alonzo Church, inventor of the lambda calculus, developed a higher-order logic commonly called Church's Theory of Types, in order to avoid the Kleene–Rosser paradox afflicting the original pure lambda calculus. Church's type theory is a variant of the lambda calculus in which expressions (also called formulas or λ-terms) are classified into types, and the types of expressions restrict the ways in which they can be combined. In other words, it is a typed lambda calculus. Today many other such calculi are in use, including Per Martin-Löf's Intuitionistic type theory, Jean-Yves Girard's System F and the Calculus of Constructions. In typed lambda calculi, types play a role similar to that of sets in set theory.
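
The way types restrict how expressions may be combined can be made concrete with a toy checker for a simply typed lambda calculus, the simplest member of this family (a hypothetical Python sketch; the AST class names below are invented for illustration):

```python
# A toy type checker for a simply typed lambda calculus. An application
# is well-typed only when the function has an arrow type whose argument
# type matches the type of the actual argument.
from dataclasses import dataclass

@dataclass(frozen=True)
class Base:          # a base type, e.g. Base("int")
    name: str

@dataclass(frozen=True)
class Arrow:         # a function type: arg -> res
    arg: object
    res: object

@dataclass(frozen=True)
class Var:           # a variable occurrence
    name: str

@dataclass(frozen=True)
class Lam:           # lambda abstraction with an annotated parameter type
    param: str
    ty: object
    body: object

@dataclass(frozen=True)
class App:           # function application
    fn: object
    arg: object

def typecheck(term, env=None):
    """Return the type of `term` in `env`, raising TypeError if ill-typed."""
    env = env or {}
    if isinstance(term, Var):
        return env[term.name]
    if isinstance(term, Lam):
        body_ty = typecheck(term.body, {**env, term.param: term.ty})
        return Arrow(term.ty, body_ty)
    if isinstance(term, App):
        fn_ty = typecheck(term.fn, env)
        if not isinstance(fn_ty, Arrow) or fn_ty.arg != typecheck(term.arg, env):
            raise TypeError("ill-typed application")
        return fn_ty.res
    raise TypeError("unknown term")

# The identity function on a base type has an arrow type.
ident = Lam("x", Base("int"), Var("x"))
print(typecheck(ident))  # Arrow(arg=Base(name='int'), res=Base(name='int'))
```

Applying a non-function, such as `App(Var("x"), Var("x"))` where `x` has a base type, is rejected by the checker, which is exactly the discipline that blocks the paradoxes of the untyped calculus.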

Reference:
http://en.wikipedia.org/wiki/Type_theory

Posted by Unknown user

Descriptive complexity is a branch of finite model theory, a subfield of computational complexity theory and mathematical logic, which seeks to characterize complexity classes by the type of logic needed to express the languages in them. For example, PH, the union of all complexity classes in the polynomial hierarchy, is precisely the class of languages expressible by statements of second-order logic. This connection between complexity and logic allows results to be transferred easily from one area to the other, facilitating new proof methods and providing additional evidence that the main complexity classes are somehow "natural" and not tied to the specific abstract machines used to define them.

Specifically, each logical system produces a set of queries expressible in it. The queries correspond to the computational problems of traditional complexity theory.

The first main result of descriptive complexity was Fagin's theorem, shown by Ronald Fagin in 1974, which established that NP is precisely the set of languages expressible by sentences of existential second-order logic; that is, second order logic excluding universal quantification over relations, functions, and subsets. Many other classes were later characterized in such a manner, most of them by Neil Immerman:

First-order logic defines the class FO, corresponding to AC^0, the languages recognized by polynomial-size circuits of bounded depth, which equals the languages recognized by a concurrent random access machine in constant time.
First-order logic with a symmetric transitive closure operator added yields SL, which equals L, the problems solvable in logarithmic space.
First-order logic with a transitive closure operator yields NL, the problems solvable in nondeterministic logarithmic space.
In the presence of linear order, first-order logic with a least fixed point operator gives P, the problems solvable in deterministic polynomial time.
Existential second-order logic yields NP, as mentioned above.
Universal second-order logic (excluding existential second-order quantification) yields co-NP.
Second-order logic corresponds to PH.
Second-order logic with a transitive closure (commutative or not) yields PSPACE, the problems solvable in polynomial space.
Second-order logic with a least fixed point operator gives EXPTIME, the problems solvable in exponential time.
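
The central idea behind these characterizations, that a logical sentence evaluated over finite structures defines a decision problem, can be sketched concretely (a hypothetical Python example; the function and variable names are invented here):

```python
# The first-order sentence  forall x . exists y . E(x, y)
# ("every vertex has an out-neighbour") defines a decision problem on
# finite graphs; evaluating it directly mirrors the quantifier structure.

def satisfies_all_have_successor(vertices, edges):
    """Evaluate 'forall x exists y E(x, y)' on the structure (vertices, edges)."""
    return all(any((x, y) in edges for y in vertices) for x in vertices)

cycle = {(0, 1), (1, 2), (2, 0)}
path = {(0, 1), (1, 2)}

print(satisfies_all_have_successor({0, 1, 2}, cycle))  # True
print(satisfies_all_have_successor({0, 1, 2}, path))   # False: vertex 2 has no successor
```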

Reference:
http://en.wikipedia.org/wiki/Descriptive_complexity

Posted by Unknown user

Least fixed point

Computer/Terms 2008. 4. 10. 11:32

In order theory, a branch of mathematics, the least fixed point (lfp or LFP) of a function is the fixed point which is less than or equal to all other fixed points, according to some partial order.

For example, with the usual order on the real numbers, the least fixed point of the real function f(x) = x^2 is x = 0 (its only other fixed point, x = 1, is larger). Many fixed-point theorems yield algorithms for locating the least fixed point. Least fixed points often have desirable properties that arbitrary fixed points do not.
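
For monotone functions on a finite powerset lattice (ordered by inclusion), one such algorithm is Kleene iteration from the bottom element, the empty set (a hypothetical Python sketch; the transition system used as the example is invented here):

```python
# Kleene iteration: for a monotone function f on a finite lattice, iterating
# f from the bottom element converges to the least fixed point.

def least_fixed_point(f, bottom=frozenset()):
    """Iterate f from bottom until a fixed point is reached."""
    current = bottom
    while True:
        nxt = f(current)
        if nxt == current:
            return current
        current = nxt

# Example: the set of states reachable from state 0 in a transition system
# is the least fixed point of  F(S) = {0} U successors(S).
edges = {0: {1}, 1: {2}, 2: set(), 3: {4}, 4: set()}

def step(s):
    # F is monotone with respect to set inclusion, so the lfp exists.
    return frozenset({0}) | frozenset(t for x in s for t in edges[x])

print(least_fixed_point(step))  # frozenset({0, 1, 2})
```

The iteration passes through {} ⊂ {0} ⊂ {0, 1} ⊂ {0, 1, 2}, then stabilizes; the states 3 and 4 never enter because they are unreachable.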

In mathematical logic, the least fixed point is closely related to recursive definitions: a recursively defined object can be viewed as the least fixed point of the operator its definition induces. Immerman and Vardi independently showed the descriptive complexity result that the polynomial-time computable properties of linearly ordered structures are definable in LFP (first-order logic with a least fixed point operator). However, LFP is too weak to express all polynomial-time properties of unordered structures (for instance, that a structure has even size).

Reference:
http://en.wikipedia.org/wiki/Least_fixed_point

Posted by Unknown user

Transitive closure

Computer/Terms 2008. 4. 10. 11:28

In mathematics, the transitive closure of a binary relation R on a set X is the smallest transitive relation on X that contains R.

For example, if X is the set of humans (alive or dead) and R is the relation 'parent of', then the transitive closure of R is the relation "x is an ancestor of y". Or, if X is a set of airports and xRy means "there is a direct flight from airport x to airport y", then the transitive closure of R is the relation "it is possible to fly from x to y in one or more flights."
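
The closure of a finite relation can be computed by repeatedly adding composed pairs until nothing new appears (a hypothetical Python sketch of this naive method, using the ancestor example above):

```python
# Transitive closure by saturation: keep composing the relation with
# itself and adding the resulting pairs until a fixed point is reached.

def transitive_closure(pairs):
    closure = set(pairs)
    while True:
        new = {(x, w) for (x, y) in closure for (z, w) in closure if y == z}
        if new <= closure:          # nothing new: closure is transitive
            return closure
        closure |= new

# The "parent of" relation; its closure is the "ancestor of" relation.
parent = {("alice", "bob"), ("bob", "carol")}
print(transitive_closure(parent))
# contains ("alice", "carol"): alice is carol's ancestor
```

For large relations, algorithms such as Floyd–Warshall achieve the same result more efficiently, but the saturation loop above mirrors the definition as the smallest transitive relation containing R.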

Reference:
http://en.wikipedia.org/wiki/Transitive_closure

Posted by Unknown user

In mathematical logic, the Löwenheim–Skolem theorem states that if a countable first-order theory has an infinite model, then for every infinite cardinal number κ it has a model of size κ. The result implies that first-order theories are unable to control the cardinality of their infinite models, and that no first-order theory with an infinite model can have exactly one model up to isomorphism.

Reference:
http://en.wikipedia.org/wiki/L%C3%B6wenheim%E2%80%93Skolem_theorem
Posted by Unknown user