2. Mathematics (from Greek μάθημα máthēma, “knowledge,
study, learning”) is the abstract study of subjects
encompassing quantity,[2] structure,[3] space,[2] change,[4][5]
and more;[6] it has no generally accepted definition.[7][8]
Mathematicians seek out patterns[9][10] and formulate
new conjectures. Mathematicians resolve the truth or
falsity of conjectures by mathematical proof. The research
required to solve mathematical problems can take years
or even centuries of sustained inquiry. Since the pioneering
work of Giuseppe Peano (1858–1932), David Hilbert (1862–
1943), and others on axiomatic systems in the late 19th
century, it has become customary to view mathematical
research as establishing truth by rigorous deduction from
appropriately chosen axioms and definitions. When those
mathematical structures are good models of real
phenomena, then mathematical reasoning often provides
insight or predictions about nature.
3. Through the use of abstraction and logical reasoning, mathematics developed
from counting, calculation, measurement, and the systematic study of
the shapes and motions of physical objects. Practical mathematics has been a
human activity for as far back as written records exist. Rigorous arguments first
appeared in Greek mathematics, most notably in Euclid's Elements.
Mathematics developed at a relatively slow pace until the Renaissance, when
mathematical innovations interacting with new scientific discoveries led to a
rapid increase in the rate of mathematical discovery that has continued to the
present day.[11]
Galileo Galilei (1564–1642) said, 'The universe cannot be read until we have
learned the language and become familiar with the characters in which it is
written. It is written in mathematical language, and the letters are triangles,
circles and other geometrical figures, without which means it is humanly
impossible to comprehend a single word. Without these, one is wandering
about in a dark labyrinth'.[12] Carl Friedrich Gauss (1777–1855) referred to
mathematics as "the Queen of the Sciences".[13] Benjamin Peirce (1809–1880)
called mathematics "the science that draws necessary conclusions".[14] David
Hilbert said of mathematics: "We are not speaking here of arbitrariness in any
sense. Mathematics is not like a game whose tasks are determined by arbitrarily
stipulated rules. Rather, it is a conceptual system possessing internal necessity
that can only be so and by no means otherwise."[15] Albert Einstein (1879–1955)
stated that "as far as the laws of mathematics refer to reality, they are not
certain; and as far as they are certain, they do not refer to reality".[16]
4. Mathematics is used throughout the world as
an essential tool in many fields,
including natural
science, engineering, medicine, and the social
sciences. Applied mathematics, the branch of
mathematics concerned with the application of
mathematical knowledge to other fields,
inspires and makes use of new mathematical
discoveries, which has led to the development
of entirely new mathematical disciplines, such
as statistics and game theory. Mathematicians
also engage in pure mathematics, or
mathematics for its own sake, without having
any application in mind. There is no clear line
separating pure and applied mathematics,
and practical applications for what began as
pure mathematics are often discovered.[17]
5. ALGORITHMS
Algorithmic information theory is a subfield of information theory and computer
science that concerns itself with the relationship between computation and
information. According to Gregory Chaitin, it is "the result of putting Shannon's
information theory and Turing's computability theory into a cocktail shaker and
shaking vigorously."
Algorithmic information theory principally studies complexity measures on strings (or
other data structures). Because most mathematical objects can be described in
terms of strings, or as the limit of a sequence of strings, it can be used to study a
wide variety of mathematical objects, including integers and real numbers.
This use of the term "information" might be a bit misleading, as it depends upon the
concept of compressibility. Informally, from the point of view of algorithmic
information theory, the information content of a string is equivalent to the length of
the shortest possible self-contained representation of that string. A self-contained
representation is essentially a program – in some fixed but otherwise irrelevant
universal programming language – that, when run, outputs the original string.
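Kolmogorov complexity itself is not computable, but a general-purpose compressor gives a rough upper bound on a string's information content. The short Python sketch below (a minimal illustration; the particular strings and the use of the standard zlib module are assumptions of this example, not part of the theory) shows that a highly patterned string compresses far better than a pseudorandom one:

import os
import zlib

def compressed_size(data: bytes) -> int:
    # Length of a zlib-compressed representation: a crude upper bound
    # on the string's algorithmic information content.
    return len(zlib.compress(data, level=9))

regular = b"ab" * 5000            # highly patterned: compresses to a few dozen bytes
random_like = os.urandom(10000)   # pseudorandom: barely compresses at all

print(len(regular), compressed_size(regular))
print(len(random_like), compressed_size(random_like))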
Turing himself was fascinated with how the distinction between software and
hardware illuminated immortality and the soul. Identifying personal identity with
computer software ensured that humans were immortal, since even though
hardware could be destroyed, software resided in a realm of mathematical
abstraction and was thus immune to destruction.
6. An infinite binary sequence is said to be random if, for some constant c, for all
n, the Kolmogorov complexity of the initial segment of length n of the
sequence is at least n − c. It can be shown that almost every sequence (from
the point of view of the standard measure — "fair coin" or Lebesgue measure
— on the space of infinite binary sequences) is random. Also, since it can be
shown that the Kolmogorov complexity relative to two different universal
machines differs by at most a constant, the collection of random infinite
sequences does not depend on the choice of universal machine (in contrast
to finite strings). This definition of randomness is usually called Martin-Löf
randomness, after Per Martin-Löf, to distinguish it from other similar notions
of randomness. It is also sometimes called 1-randomness to distinguish it
from other stronger notions of randomness (2-randomness, 3-randomness,
etc.).
(Related definitions can be made for alphabets other than the set {0, 1}.)
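Restated symbolically, with K denoting Kolmogorov complexity and x↾n the initial segment of length n, the condition above reads: there is a constant c such that K(x↾n) ≥ n − c for all n.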
Algorithm
An algorithm is any well-defined procedure for solving a given class of
problems. Ideally, when applied to a particular problem in that class, the
algorithm would yield a full solution. Nonetheless, it makes sense to speak of
algorithms that yield only partial solutions or yield solutions only some of the
time. Such algorithms are sometimes called "rules of thumb" or "heuristics."
Algorithms have been around throughout recorded history. The ancient
Hindus, Greeks, Babylonians, and Chinese all had algorithms for doing
arithmetic computations. The term algorithm itself derives from the name of the
ninth-century mathematician al-Khwārizmī, and its later form was influenced by the Greek word for number (arithmos).
7. Algorithms are typically constructed on a case-by-case basis, being adapted to the
problem at hand. Nonetheless, the possibility of a universal algorithm that could in
principle resolve all problems has been a recurrent theme over the last millennium.
Spanish theologian Raymond Lully (c. 1232–1315), in his Ars Magna, proposed to
reduce all rational discussion to mechanical manipulations of symbolic notation and
combinatorial diagrams. German philosopher Gottfried Wilhelm Leibniz (1646–
1716) argued that Lully's project was overreaching but had merit when conceived
more narrowly.
The idea of a universal algorithm did not take hold, however, until technology had
advanced sufficiently to mechanize it. The Cambridge mathematician Charles
Babbage (1791–1871) conceived and designed the first machine that could in
principle resolve all well-defined arithmetic problems. Nevertheless, he was unable
to build a working prototype. Over a century later another Cambridge
mathematician, Alan Turing (1912–1954), laid the theoretical foundations for
effectively implementing a universal algorithm.
Turing proposed a very simple conceptual device involving a tape with a movable
reader that could mark and erase letters on the tape. Turing showed that all
algorithms could be mapped onto the tape (as data) and then run by a universal
algorithm already inscribed on the tape. This machine, known as a universal Turing
machine, became the basis for the modern theory of computation (known as
recursion theory) and inspired the modern digital computer.
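As a rough illustration of the tape-and-reader idea (not Turing's original construction; the states and transition table below implement binary increment and are invented for this sketch), a few lines of Python can simulate such a machine:

# Minimal Turing machine simulator (illustrative sketch only).
def run_turing_machine(tape, transitions, state="right", blank="_"):
    tape = list(tape)
    head = 0
    while state != "done":
        if head < 0:                 # extend the tape with blanks as needed
            tape.insert(0, blank)
            head = 0
        if head >= len(tape):
            tape.append(blank)
        symbol = tape[head]
        write, move, state = transitions[(state, symbol)]
        tape[head] = write
        head += 1 if move == "R" else -1
    return "".join(tape).strip(blank)

# (state, symbol read) -> (symbol to write, move, next state); binary increment.
increment = {
    ("right", "0"): ("0", "R", "right"),   # scan right to the end of the number
    ("right", "1"): ("1", "R", "right"),
    ("right", "_"): ("_", "L", "carry"),   # step back onto the last digit
    ("carry", "1"): ("0", "L", "carry"),   # 1 plus carry gives 0, keep carrying
    ("carry", "0"): ("1", "L", "done"),    # 0 plus carry gives 1, finished
    ("carry", "_"): ("1", "L", "done"),    # ran off the left edge: new leading 1
}

print(run_turing_machine("1011", increment))   # prints 1100 (11 + 1 = 12)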
Turing's universal algorithm fell short of Lully's vision of an algorithm that could
resolve all problems. Turing's universal algorithm is not so much a universal
problem solver as an empty box capable of housing and implementing the
algorithms placed into it. Thus Turing invited into the theory of computing the very
Cartesian distinction between hardware and software. Hardware is the mechanical
device (i.e., the empty box) that houses and implements software (i.e., the
algorithms) running on it.
8. It is a deep and much disputed question
whether the essence of what constitutes
the human person is at base
computational and therefore an emergent
property of algorithms, or whether it
fundamentally transcends the capacity of
algorithms.
In mathematics and computer science, an algorithm is a finite, well-defined
procedure for performing a computation or solving a given class of problems.
9. The logarithm of a number is the exponent by
which another fixed value, the base, has to be
raised to produce that number. For example, the
logarithm of 1000 to base 10 is 3, because 1000 is
10 to the power 3: 1000 = 10^3 = 10 × 10 × 10. More
generally, if x = b^y, then y is the logarithm of x to
base b, and is written y = log_b(x), so log_10(1000) = 3.
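As a quick check of this definition (a minimal Python snippet; the numbers are just the example above):

import math

# log_b(x) is the exponent y with b ** y == x; here b = 10 and x = 1000.
y = math.log10(1000)
print(y)          # 3.0
print(10 ** y)    # 1000.0: exponentiation undoes the logarithm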
10. Logarithms were introduced by John Napier in the early
17th century as a means to simplify calculations. They were
rapidly adopted by scientists, engineers, and others to
perform computations more easily, using slide
rules and logarithm tables. These devices rely on the fact—
important in its own right—that the logarithm of
a product is the sum of the logarithms of the factors:
log_b(xy) = log_b(x) + log_b(y).
The present-day notion of logarithms comes from Leonhard
Euler, who connected them to the exponential function in
the 18th century.
The logarithm to base b = 10 is called the common
logarithm and has many applications in science and
engineering. The natural logarithm has the constant e (≈
2.718) as its base; its use is widespread in pure mathematics,
especially calculus. The binary logarithm uses base b = 2 and
is prominent in computer science.
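A short Python illustration of the product rule and of the three bases just mentioned (the particular values 8 and 32 are arbitrary):

import math

x, y = 8.0, 32.0
# Product rule: the logarithm of a product equals the sum of the logarithms.
print(math.log10(x * y), math.log10(x) + math.log10(y))   # both about 2.408

print(math.log10(100))    # 2.0  common logarithm (base 10)
print(math.log(math.e))   # 1.0  natural logarithm (base e)
print(math.log2(32))      # 5.0  binary logarithm (base 2)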
11. Logarithmic scales reduce wide-ranging quantities to
smaller scopes. For example, the decibel is a logarithmic unit
quantifying sound pressure and voltage ratios. In
chemistry, pH and pOH are logarithmic measures for
the acidity of an aqueous solution. Logarithms are
commonplace in scientific formulae, and in measurements
of the complexity of algorithms and of geometric objects
called fractals. They describe musical intervals, appear in
formulae counting prime numbers, inform some models
in psychophysics, and can aid in forensic accounting.
In the same way as the logarithm reverses exponentiation,
the complex logarithm is the inverse function of the
exponential function applied to complex numbers.
The discrete logarithm is another variant; it has applications
in public-key cryptography.
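Two of these uses can be made concrete in a few lines of Python (the concentration, power ratio, and the tiny modulus p = 23 are illustrative assumptions; real cryptographic moduli are enormous):

import math

# pH is the negative common logarithm of the hydrogen-ion concentration.
print(-math.log10(1e-7))              # 7.0, roughly neutral water

# Decibels express a power ratio on a logarithmic scale.
print(10 * math.log10(1000.0))        # 30.0 dB

# Discrete logarithm: recover x from g ** x mod p by brute force.
# For large p this search is believed infeasible, which public-key schemes exploit.
g, p, target = 5, 23, pow(5, 13, 23)
print(next(k for k in range(p) if pow(g, k, p) == target))   # 13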
12. ARITHMETIC PROGRESSION
In mathematics, an arithmetic progression (AP) or arithmetic
sequence is a sequence of numbers such that the difference between
the consecutive terms is constant. For instance, the sequence 3, 5, 7, 9,
11, 13, … is an arithmetic progression with common difference of 2.
If the initial term of an arithmetic progression is a_1 and the common
difference of successive members is d, then the nth term of the
sequence (a_n) is given by: a_n = a_1 + (n − 1)d, as illustrated in the sketch below.
A finite portion of an arithmetic progression is called a finite arithmetic
progression and sometimes just called an arithmetic progression. The
sum of a finite arithmetic progression is called an arithmetic series.
The behavior of the arithmetic progression depends on the common
difference d. If the common difference is:
13. Positive, the members (terms) will grow towards positive infinity.
Negative, the members (terms) will grow towards negative infinity.
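A minimal Python sketch of the nth-term formula and of the arithmetic series sum (the helper names and the example values a_1 = 3, d = 2 are assumptions for illustration, matching the sequence 3, 5, 7, 9, 11, 13 above):

def nth_term(a1, d, n):
    # a_n = a_1 + (n - 1) * d
    return a1 + (n - 1) * d

def series_sum(a1, d, n):
    # Sum of the first n terms: n times the average of the first and last terms.
    return n * (a1 + nth_term(a1, d, n)) / 2

print([nth_term(3, 2, n) for n in range(1, 7)])   # [3, 5, 7, 9, 11, 13]
print(series_sum(3, 2, 6))                        # 48.0 = 3 + 5 + 7 + 9 + 11 + 13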
Arithmetic
Arithmetic is a branch of mathematics concerned with the addition, subtraction,
multiplication, division, and extraction of roots of certain numbers known as real
numbers. Real numbers are numbers with which you are familiar in everyday life:
whole numbers, fractions, decimals, and roots, for example.
Early development of arithmetic
Arithmetic grew out of the need that people have for counting objects. For
example, Stone Age men or women probably needed to count the number of
children they had. Later, one person might want to know the number of oxen to
be given away in exchange for a wife or husband. For many centuries, however,
counting probably never went beyond 10, the number of fingers on which one
could tally the objects being counted.
14. Numbering system
The numbering system we use today is called the Hindu-Arabic
system. It was developed by the Hindu civilization of India about
1,500 years ago and then brought to Europe by the Arabs in the
Middle Ages (400–1450). During the seventeenth century, the
Hindu-Arabic system completely replaced the Roman numeral
system that had been in use earlier.
The Hindu-Arabic system is also called a decimal system because
it is based on the number 10. The ten symbols used in the decimal
system are 0, 1, 2, 3, 4, 5, 6, 7, 8, and 9. Other number systems
are possible and, in fact, are also used today. Computers, for
example, operate on a binary system that uses only two
digits, 0 and 1. Our system of time uses the sexagesimal
(pronounced sek-se-JES-em-el) system, which is based on the
number 60.
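A brief Python sketch of how one quantity is written in different positional bases (the conversion helper and the example value 1450 are assumptions made for illustration):

def to_base(n, base):
    # Repeatedly divide by the base; the remainders are the digits,
    # least significant first, so reverse them at the end.
    digits = []
    while n:
        n, r = divmod(n, base)
        digits.append(r)
    return list(reversed(digits)) or [0]

print(to_base(1450, 10))   # [1, 4, 5, 0]                       decimal digits
print(to_base(1450, 2))    # [1, 0, 1, 1, 0, 1, 0, 1, 0, 1, 0]  binary digits
print(to_base(1450, 60))   # [24, 10]                           sexagesimal: 24*60 + 10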
15. Words to Know
Associative law: An axiom that states that grouping numbers during
addition or multiplication does not change the final result.
Axiom: A basic statement of fact that is stipulated as true without being
subject to proof.
Closure property: An axiom that states that the result of the addition or
multiplication of two real numbers is a real number.
Commutative law: An axiom of addition and multiplication that states
that the order in which numbers are added or multiplied does not
change the final result.
Hindu-Arabic number system: A positional number system that uses
ten symbols to represent numbers and uses zero as a place holder. It is
the number system that we use today.
Inverse operation: A mathematical operation that reverses the work of
another operation; for example, subtraction is the inverse operation of
addition.
16. THANK YOU
• MADE BY:
• NISARG – 6231
• CHAITYA – 6882
• ANURAG – 5556
• IMRAN – 7003