
Statics

Example of a beam in static equilibrium: the sum of forces and moments is zero.

Statics is the branch of physics concerned with the analysis of loads (force, torque/moment) on physical systems in static equilibrium, that is, in a state where the relative positions of subsystems do not vary over time, or where components and structures are at rest under the action of external forces. In other words, statics describes how forces are transmitted through the members of an object such as a crane, from the point where load is applied (the hanging end) to the point where the object is supported (the base). When in static equilibrium, the system is either at rest or its center of mass moves at constant velocity. By Newton's second law, this implies that the net force and net torque (also known as moment) on every body in the system is zero, meaning that for every force bearing upon a member there must be an equal and opposite force. From this constraint, quantities such as stress and pressure can be derived. The net force equalling zero is known as the first condition for equilibrium, and the net torque equalling zero as the second condition for equilibrium (see statically determinate). Statics is used extensively in the analysis of structures, for instance in architectural and structural engineering. Strength of materials is a related field of mechanics that relies heavily on the application of static equilibrium.

Hydrostatics, also known as fluid statics, is the study of fluids at rest; it analyzes systems in static equilibrium that involve forces exerted by fluids. The defining characteristic of a fluid at rest is that the pressure exerted on any particle of the fluid is the same in every direction; if the forces were unequal, the fluid would move in the direction of the resultant force. This concept was first formulated, in a slightly extended form, by the French mathematician and philosopher Blaise Pascal in 1647 and later became known as Pascal's law. The law has many important applications in hydraulics. Galileo was also a major figure in the development of hydrostatics.

In economics, "static" analysis has substantially the same meaning as in physics. Since Paul Samuelson's Foundations of Economic Analysis (1947), the focus has been on "comparative statics": the comparison of one static equilibrium to another, with little or no discussion of the process of moving between them beyond noting the exogenous changes that caused the movement.

In exploration geophysics, "statics" is short for "static correction", referring to bulk time shifts applied to a reflection seismogram to correct for variations in elevation and in the velocity of the seismic pulse through the weathered and unconsolidated upper layers.

Mathematics

Mathematics is the body of knowledge centered on concepts such as quantity, structure, space, and change, and also the academic discipline that studies them. Benjamin Peirce called it "the science that draws necessary conclusions". Other practitioners maintain that mathematics is the science of pattern, and that mathematicians seek out patterns whether found in numbers, space, science, computers, imaginary abstractions, or elsewhere. Mathematicians explore such concepts, aiming to formulate new conjectures and establish their truth by rigorous deduction from appropriately chosen axioms and definitions. Through the use of abstraction and logical reasoning, mathematics evolved from counting, calculation, measurement, and the systematic study of the shapes and motions of physical objects. Knowledge and use of basic mathematics have always been an inherent and integral part of individual and group life. Refinements of the basic ideas are visible in mathematical texts originating in the ancient Egyptian, Mesopotamian, Indian, Chinese, Greek and Islamic worlds.
Rigorous arguments first appeared in Greek mathematics, most notably in Euclid's Elements. The development continued in fitful bursts until the Renaissance period of the 16th century, when mathematical innovations interacted with new scientific discoveries, leading to an acceleration in research that continues to the present day. Today, mathematics is used throughout the world in many fields, including natural science, engineering, medicine, and the social sciences such as economics. Applied mathematics, the application of mathematics to such fields, inspires and makes use of new mathematical discoveries and sometimes leads to the development of entirely new disciplines. Mathematicians also engage in pure mathematics, or mathematics for its own sake, without having any application in mind, although applications for what began as pure mathematics are often discovered later.

Etymology
The word "mathematics" (Greek: μαθηματικά, mathēmatiká) comes from the Greek μάθημα (máthēma), which means learning, study, science, and additionally came to have the narrower and more technical meaning "mathematical study", even in Classical times. Its adjective is μαθηματικός (mathēmatikós), related to learning, or studious, which likewise further came to mean mathematical. In particular, μαθηματικὴ τέχνη (mathēmatikḕ tékhnē), in Latin ars mathematica, meant the mathematical art. The apparent plural form in English, like the French plural form les mathématiques (and the less commonly used singular derivative la mathématique), goes back to the Latin neuter plural mathematica (Cicero), based on the Greek plural τὰ μαθηματικά (ta mathēmatiká), used by Aristotle, and meaning roughly "all things mathematical". In English, however, the noun mathematics takes singular verb forms. It is often shortened to math in English-speaking North America and maths elsewhere.

History
The evolution of mathematics might be seen as an ever-increasing series of abstractions, or alternatively an expansion of subject matter. The first abstraction was probably that of numbers: the realization that two apples and two oranges have something in common was a breakthrough in human thought. In addition to recognizing how to count physical objects, prehistoric peoples also recognized how to count abstract quantities, like time: days, seasons, years. Arithmetic (addition, subtraction, multiplication and division) naturally followed. Further steps required writing or some other system for recording numbers, such as tallies or the knotted strings called quipu used by the Inca to store numerical data. Numeral systems have been many and diverse, with the first known written numerals created by the Egyptians in Middle Kingdom texts such as the Rhind Mathematical Papyrus. The modern decimal system, including the concept of zero, was developed in ancient India.

From the beginnings of recorded history, the major disciplines within mathematics arose out of the need to do calculations relating to taxation and commerce, to understand the relationships among numbers, to measure land, and to predict astronomical events. These needs can be roughly related to the broad subdivision of mathematics into the studies of quantity, structure, space, and change. Mathematics has since been greatly extended, and there has been a fruitful interaction between mathematics and science, to the benefit of both. Mathematical discoveries have been made throughout history and continue to be made today. According to Mikhail B. Sevryuk, in the January 2006 issue of the Bulletin of the American Mathematical Society, "The number of papers and books included in the Mathematical Reviews database since 1940 (the first year of operation of MR) is now more than 1.9 million, and more than 75 thousand items are added to the database each year.
The overwhelming majority of works in this ocean contain new mathematical theorems and their proofs."

Inspiration, pure and applied mathematics, and aesthetics

Mathematics arises wherever there are difficult problems that involve quantity, structure, space, or change. At first these were found in commerce, land measurement and, later, astronomy; nowadays, all sciences suggest problems studied by mathematicians, and many problems arise within mathematics itself. For example, Richard Feynman invented the Feynman path integral using a combination of mathematical reasoning and physical insight, and today's string theory continues to inspire new mathematics. Some mathematics is only relevant in the area that inspired it and is applied to solve further problems in that area. But often mathematics inspired by one area proves useful in many areas and joins the general stock of mathematical concepts. The remarkable fact that even the "purest" mathematics often turns out to have practical applications is what Eugene Wigner has called "the unreasonable effectiveness of mathematics."

As in most areas of study, the explosion of knowledge in the scientific age has led to specialization in mathematics. One major distinction is between pure mathematics and applied mathematics. Several areas of applied mathematics have merged with related traditions outside of mathematics and become disciplines in their own right, including statistics, operations research, and computer science.

For those who are mathematically inclined, there is often a definite aesthetic aspect to much of mathematics. Many mathematicians talk about the elegance of mathematics, its intrinsic aesthetics and inner beauty; simplicity and generality are valued. There is beauty in a simple and elegant proof, such as Euclid's proof that there are infinitely many prime numbers, and in an elegant numerical method that speeds calculation, such as the fast Fourier transform. G. H. Hardy in A Mathematician's Apology expressed the belief that these aesthetic considerations are, in themselves, sufficient to justify the study of pure mathematics.
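Euclid's proof that there are infinitely many primes rests on a single observation: given any finite list of primes, the product of the list plus one is divisible by none of them, so any prime factor of that number must be new. A minimal sketch of that argument (the function names are illustrative, not from any source):

```python
def smallest_prime_factor(n):
    """Return the smallest prime factor of n (n >= 2) by trial division."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d
        d += 1
    return n  # n itself is prime


def new_prime(primes):
    """Given a finite list of primes, produce a prime not in the list.

    N = (product of the list) + 1 leaves remainder 1 when divided by
    every prime in the list, so its smallest prime factor is new.
    """
    n = 1
    for p in primes:
        n *= p
    return smallest_prime_factor(n + 1)


# Example: 2*3*5*7 + 1 = 211, which happens to be prime itself.
p = new_prime([2, 3, 5, 7])
assert p not in [2, 3, 5, 7]
```

Since this construction succeeds for every finite list, no finite list can contain all primes, which is exactly Euclid's conclusion.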
Mathematicians often strive to find proofs of theorems that are particularly elegant, a quest Paul Erdős often referred to as finding proofs from "The Book" in which God had written down his favorite proofs. The popularity of recreational mathematics is another sign of the pleasure many find in solving mathematical questions.

Electrical engineering

Electrical engineering, sometimes referred to as electrical and electronic engineering, is a field of engineering that deals with the study and application of electricity, electronics and electromagnetism. The field first became an identifiable occupation in the late nineteenth century after commercialization of the electric telegraph and electrical power supply. It now covers a range of subtopics including power, electronics, control systems, signal processing and telecommunications.

Electrical engineering may or may not encompass electronic engineering. Where a distinction is made, usually outside of the United States, electrical engineering is considered to deal with the problems associated with large-scale electrical systems such as power transmission and motor control, whereas electronic engineering deals with the study of small-scale electronic systems including computers and integrated circuits. Alternatively, electrical engineers are usually concerned with using electricity to transmit energy, while electronic engineers are concerned with using electricity to transmit information.

History
Electricity has been a subject of scientific interest since at least the early 17th century. The first electrical engineer was probably William Gilbert, who designed the versorium: a device that detected the presence of statically charged objects. He was also the first to draw a clear distinction between magnetism and static electricity, and is credited with establishing the term "electricity". In 1775 Alessandro Volta devised the electrophorus, a device that produced a static electric charge, and by 1800 he had developed the voltaic pile, a forerunner of the electric battery.

However, it was not until the 19th century that research into the subject started to intensify. Notable developments in this century include the work of Georg Ohm, who in 1827 quantified the relationship between the electric current and potential difference in a conductor; Michael Faraday, the discoverer of electromagnetic induction in 1831; and James Clerk Maxwell, who in 1873 published a unified theory of electricity and magnetism in A Treatise on Electricity and Magnetism.

During these years, the study of electricity was largely considered to be a subfield of physics. It was not until the late 19th century that universities started to offer degrees in electrical engineering. The Darmstadt University of Technology founded the first chair and the first faculty of electrical engineering worldwide in 1882. In 1883 Darmstadt University of Technology and Cornell University introduced the world's first courses of study in electrical engineering, and in 1885 University College London founded the first chair of electrical engineering in the United Kingdom. The University of Missouri subsequently established the first department of electrical engineering in the United States in 1886. During this period, the work concerning electrical engineering increased dramatically.
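The relationship Ohm quantified in 1827 is now written V = IR: the potential difference across a conductor equals the current through it times its resistance. A minimal numerical sketch (the 110 V / 22 ohm values below are arbitrary illustrations, not figures from the historical record):

```python
def current(voltage, resistance):
    """Current through a conductor by Ohm's law: I = V / R (amperes)."""
    return voltage / resistance


def voltage(current_amps, resistance):
    """Potential difference across a conductor: V = I * R (volts)."""
    return current_amps * resistance


# A hypothetical 110 V supply across a 22-ohm load draws 5 A.
i = current(110, 22)
assert voltage(i, 22) == 110  # the two forms are consistent
```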
In 1882, Edison switched on the world's first large-scale electrical supply network, which provided 110 volts direct current to fifty-nine customers in lower Manhattan. In 1887, Nikola Tesla filed a number of patents related to a competing form of power distribution known as alternating current. In the following years a bitter rivalry between Tesla and Edison, known as the "War of Currents", took place over the preferred method of distribution. AC eventually replaced DC for generation and power distribution, enormously extending the range and improving the safety and efficiency of power distribution. The efforts of the two did much to further electrical engineering: Tesla's work on induction motors and polyphase systems influenced the field for years to come, while Edison's work on telegraphy and his development of the stock ticker proved lucrative for his company, which ultimately became General Electric. By the end of the 19th century, however, other key figures in the progress of electrical engineering were beginning to emerge.

Modern developments

Emergence of radio and electronics

During the development of radio, many scientists and inventors contributed to radio technology and electronics. In his classic UHF experiments of 1888, Heinrich Hertz transmitted (via a spark-gap transmitter) and detected radio waves using electrical equipment. In 1895, Nikola Tesla was able to detect signals from the transmissions of his New York lab at West Point, a distance of 80.4 km (49.95 miles). In 1897, Karl Ferdinand Braun introduced the cathode ray tube as part of an oscilloscope, a crucial enabling technology for electronic television. John Fleming invented the first radio tube, the diode, in 1904. Two years later, Robert von Lieben and Lee De Forest independently developed the amplifier tube, called the triode.

In 1895, Guglielmo Marconi furthered the art of hertzian wireless methods. Early on, he sent wireless signals over a distance of one and a half miles. In December 1901, he sent wireless waves that were not affected by the curvature of the Earth. Marconi later transmitted wireless signals across the Atlantic between Poldhu, Cornwall, and St. John's, Newfoundland, a distance of 2,100 miles.

In 1920 Albert Hull developed the magnetron, which would eventually lead to the development of the microwave oven in 1946 by Percy Spencer. In 1934 the British military began to make strides towards radar (which also uses the magnetron) under the direction of Dr Wimperis, culminating in the operation of the first radar station at Bawdsey in August 1936.

In 1941 Konrad Zuse presented the Z3, the world's first fully functional and programmable computer. In 1946 the ENIAC (Electronic Numerical Integrator and Computer) of John Presper Eckert and John Mauchly followed, beginning the computing era. The arithmetic performance of these machines allowed engineers to develop completely new technologies and achieve new objectives, including the Apollo missions and the NASA moon landing.
The invention of the transistor in 1947 by William B. Shockley, John Bardeen and Walter Brattain opened the door for more compact devices and led to the development of the integrated circuit in 1958 by Jack Kilby and, independently, in 1959 by Robert Noyce. In 1968 Marcian Hoff conceived the first microprocessor at Intel and thus ignited the development of the personal computer. The first realization of the microprocessor was the Intel 4004, a 4-bit processor developed in 1971, but only in 1973 did the Intel 8080, an 8-bit processor, make the building of the first personal computer, the Altair 8800, possible.
