The reals are uncountable; that is, while both the set of all natural numbers and the set of all real numbers are infinite sets, there can be no one-to-one function from the real numbers to the natural numbers: the cardinality of the set of all real numbers, denoted $\mathfrak{c}$ and called the cardinality of the continuum, is strictly greater than the cardinality of the set of all natural numbers, denoted $\aleph_0$ ('aleph-naught'); a sketch of the standard proof appears at the end of this section. The statement that there is no subset of the reals with cardinality strictly greater than $\aleph_0$ and strictly smaller than $\mathfrak{c}$ is known as the continuum hypothesis (CH). It is known to be neither provable nor refutable using the axioms of Zermelo–Fraenkel set theory including the axiom of choice (ZFC), the standard foundation of modern mathematics, in the sense that some models of ZFC satisfy CH, while others violate it.

Simple fractions were used by the Egyptians around 1000 BC; the Vedic "Sulba Sutras" ("The rules of chords"), c. 600 BC, include what may be the first "use" of irrational numbers. The concept of irrationality was implicitly accepted by early Indian mathematicians since Manava (c. 750–690 BC), who were aware that the square roots of certain numbers, such as 2 and 61, could not be exactly determined. Around 500 BC, the Greek mathematicians led by Pythagoras realized the need for irrational numbers, in particular the irrationality of the square root of 2.

The Middle Ages brought the acceptance of zero, negative numbers, integers, and fractions, first by Indian and Chinese mathematicians, and then by Arabic mathematicians, who were also the first to treat irrational numbers as algebraic objects, a step made possible by the development of algebra. Arabic mathematicians merged the concepts of "number" and "magnitude" into a more general idea of real numbers. The Egyptian mathematician Abū Kāmil Shujā ibn Aslam (c. 850–930) was the first to accept irrational numbers as solutions to quadratic equations or as coefficients in an equation, often in the form of square roots, cube roots and fourth roots.

In the 16th century, Simon Stevin created the basis for modern decimal notation, and insisted that there is no difference between rational and irrational numbers in this regard.
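The uncountability claimed in the first paragraph is standardly established by Cantor's diagonal argument; the following is a minimal sketch, with the enumeration $f$ and the digits $d_{nk}$, $e_n$ introduced here purely for illustration. Suppose, toward a contradiction, that some $f : \mathbb{N} \to (0, 1)$ were onto, and write each value as a decimal expansion $f(n) = 0.d_{n1}d_{n2}d_{n3}\ldots$. Define $y = 0.e_1e_2e_3\ldots$ by flipping every diagonal digit:

\[
e_n =
\begin{cases}
5 & \text{if } d_{nn} \neq 5, \\
6 & \text{if } d_{nn} = 5,
\end{cases}
\qquad
y = \sum_{n=1}^{\infty} e_n \, 10^{-n}.
\]

Then $y$ differs from $f(n)$ in the $n$-th digit for every $n$, and since its digits are only 5s and 6s its decimal expansion is unique, so $y$ lies outside the range of $f$. No such enumeration exists, hence $\aleph_0 < \mathfrak{c}$.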
