STEM
Science, Technology, Engineering, Mathematics

Bulletin

Vol. 1 No. 1, January 2012
Quarterly Bulletin of Jagan Nath University, Jaipur

Amalgamation of Science, Technology, Engineering and Mathematics
Contents

Transparent Concrete: Now You Can See Through Walls - Mr. Bharat Nagar
Rajiv Gandhi (Bandra-Worli) Sea Link Project - Mr. Pramod Kumar
Visible Light Communication: Advent of a New Era in Wireless Technology - Ms. Meenu Dave
Study of Charging and Discharging of a Condenser - A Microcomputer Based Laboratory Experiment - Prof. Y. S. Shishodia
The Trends and Future of Biotechnology Crops for Disease Control - Dr Vikas Bishnoi and Dr Preeti Sharma
Applications of Physics in Science and Engineering - Ms. Sushila
Brief History of Special Functions and Their Applications in Science and Engineering - Mr. Jagdev Singh
IRIS Recognition Algorithms - Mr. Gajendra Shrimal
3D Sensors to Allow Interaction with Computers without Touch - Ms. Kavita Choudhary
Bioplastic: A Better Alternative for a Sustainable Future - Dr Preeti Sharma and Dr Vikas Bishnoi
Benefits of FACTS Controllers in AC Transmission Systems - Mr. Vishal Sharma
Quality Management System - Mr. Vipin Goyal


STEM NEWS
Faster Than Light
Nothing can travel faster than the speed of light. This concept forms a cornerstone of our understanding of the universe and of time: the constancy of the speed of light underpins our understanding of space, time and causality, the fact that cause comes before effect. But now it seems that researchers on the OPERA experiment, which detects neutrinos sent from CERN, one of the world's largest physics laboratories, at a detector under a mountain in central Italy, have recorded neutrinos travelling at a speed that is supposedly forbidden by Einstein's theory of special relativity. (Neutrinos are mysterious particles: they have a minuscule mass, no electric charge, and pass through almost any material as though it were not there.) Light takes about 2.4 milliseconds to cover the 730 km journey, yet in experiments lasting over three years, about 15,000 neutrinos arrived about 60 billionths of a second earlier than light would, with an error margin of 10 billionths of a second. This implies that the neutrinos travelled at 299,798,454 metres per second, while the speed of light is 299,792,458 metres per second. Scientists the world over are waiting for independent verification of these results. A confirmation would necessitate a fundamental change in our understanding of space-time and of the fundamental interactions in nature.
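As a quick check of the arithmetic quoted above, the implied neutrino speed can be computed from the 730 km baseline and the 60 ns early arrival. This is a back-of-the-envelope Python sketch only; the published OPERA analysis used more precise baseline and timing values, so the result differs slightly from the speed quoted in the text.

C = 299_792_458.0      # speed of light in vacuum, m/s
BASELINE = 730_000.0   # approximate CERN to detector distance, m
EARLY = 60e-9          # reported early arrival of the neutrinos, s

t_light = BASELINE / C                      # ~2.435 ms, the "2.4 milliseconds" above
v_neutrino = BASELINE / (t_light - EARLY)   # speed implied by arriving 60 ns early

print(f"light travel time : {t_light * 1e3:.3f} ms")
print(f"implied speed     : {v_neutrino:,.0f} m/s")
print(f"fractional excess : {(v_neutrino - C) / C:.2e}")  # about 2.5e-5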

3-D Transistor: An Amazing Breakthrough


Intel Corporation has announced a significant breakthrough in the evolution of the transistor: a revolutionary 3-D transistor design called Tri-Gate. The 3-D Tri-Gate transistor is a reinvention of the transistor. The traditional "flat" two-dimensional planar gate is replaced with an incredibly thin three-dimensional silicon fin that rises up vertically from the silicon substrate. Current is controlled by implementing a gate on each of the three sides of the fin (two on the sides and one across the top) rather than just one on top, as is the case with the 2-D planar transistor. The additional control allows as much current as possible to flow when the transistor is in the "on" state (for performance) and as close to zero as possible when it is in the "off" state (to minimize power), and enables the transistor to switch very quickly between the two states (again, for performance). An improvement of about 35% has been achieved in performance and power saving, and world-shaping devices will be created from these structures. Just as skyscrapers let urban planners optimize available space by building upward, Intel's 3-D Tri-Gate transistor structure provides a way to manage density. Since the fins are vertical, transistors can be packed closer together, a critical component of the technological and economic benefits. For future generations, designers also have the ability to continue growing the height of the fins to get even more performance and energy-efficiency gains.


EDITOR'S Note

This Bulletin is a quarterly publication for the dissemination of didactic information on Science, Technology, Engineering and Mathematics and related activities in Jagan Nath University and elsewhere.

Welcome to STEM! It is with a sense of trepidation that I present the first issue of STEM to the readers. It is a very challenging task to decide on the format and contents: whether to present research and development work, the latest innovations and developments in STEM areas, or the creative work of the students through project work, competitions, on-going R&D, seminars, FDP programs and the conference participation of staff members. Information about achievements, publications and other activities in Jagan Nath University also needs to be brought out.

The stem, heralded as the strength of a plant, is invoked analogously in the acronym STEM, which serves as a potent union of Science, Technology, Engineering and Mathematics.

This Bulletin will evolve, as changes are constantly taking place in the world around us, in all fields and more so in the STEM fields. Newer developments such as iPhones, iPads, cloud computing, nanotechnology and nanostructures, optical computing and communications, mechatronics, eBooks and open coursewares are changing in a fundamental manner how we teach, what we teach and where we teach. We will endeavor to bring all that is relevant to human beings and the environment into the STEM Bulletins.

There is considerable concern about the way the teaching of STEM subjects is done. The continual concern of industry as well as academia about the lack of suitability of the scientists, engineers, technologists and mathematicians coming out of the educational system needs to be addressed. The increasing role of modern-day tools such as the internet, smart-phones, iPads and tablet PCs, as well as simulation and modeling, also needs to be utilized in communicating with the students.

I will appreciate communications from you regarding the contents of the Bulletin and how you feel about them. These will be of immense help in improving its quality, and I will be personally grateful for critical comments; they will go a long way in improving the STEM Bulletin.

Prof. Y. S. SHISHODIA

Tell Us What You Think: Please send your comments, observations and suggestions to the Editor by email at pvc@jagannathuniversity.org

DISCLAIMER: Any views or opinions presented in this Bulletin are solely those of the author, and Jagan Nath University is in no way responsible for any infringement of copyright or IPR. Readers are free to use the material included in this Bulletin; however, they are expected to acknowledge it and inform the Editor.


JAGAN NATH UNIVERSITY INITIATIVES:

Establishment of CIRDEST

Centre for Innovation, Research and Development in Engineering, Science and Technology

The need for innovation and for research and development in engineering, science and technology has long been felt. The university also needs to embark on inculcating, encouraging and implementing innovative ideas in teaching, research and development. With these objectives in view, the university has established the Centre for Innovation, Research and Development in Engineering, Science and Technology (CIRDEST). CIRDEST will encourage, stimulate and support, amongst others, the following by the university fraternity:
- Research in the broad areas of Computer Science, Information Technology, Electronics and Communication, Mechanical Engineering, Civil Engineering, Electrical Engineering, Environmental Engineering, Bio-Technology, and Physical and Mathematical Sciences.
- Development of innovative experiments in branches of engineering and the physical sciences for a better understanding of physical phenomena and design principles.
- Development of simulation experiments.
- Innovative student projects to develop hands-on technical skills and products which may lead to start-up companies.
- Interaction with other academic and research institutions, industry and entrepreneurs.
- Interaction with various funding agencies for support of R&D activities.

Establishment of Entrepreneurship Development Cell (EDC)

Keeping in view the belief that original and innovative ideas with the potential to change societal interactions, educational and scientific technologies and industry by creating value are formed in young minds, and recognizing the need to mould these ideas into successful ventures, to learn from successful and failed efforts to generate new ideas, to tap the entrepreneurial talent of such young minds, and to cultivate and encourage an entrepreneurial culture among the youth, the University has established the ENTREPRENEURSHIP DEVELOPMENT CELL. The OBJECTIVES of the EDC, amongst others, are:
1. To create entrepreneurship awareness amongst students and faculty.
2. To create self-employment awareness.
3. To educate youngsters to take benefit of government policies in establishing their own ventures.
4. To conduct vocational training/skill development programs.
5. To organize meetings with funding agencies.
6. To provide interaction with successful entrepreneurs.


Transparent Concrete: Now you can see through walls


Bharat Nagar
Department of Civil Engineering, Jagan Nath University, Jaipur

Remember the dear little ghost, Casper, who could walk through walls? Now you too can have close to similar powers! You will have to wait a while before you can walk through walls, but for the time being, you can see through them. This article introduces a new technology in civil engineering: transparent concrete, also called light-transmitting concrete. Transparency is the new buzzword that attracts attention, whether in the humanities and social context, where it implies openness in communication, or in the case of objects such as glass buildings that can literally be seen through. Observing the increasing trend towards transparency in every sphere, especially in the building material sector, the Hungarian architect Aron Losonczi asked: is it possible to make a concrete structure transparent? When imagination runs wild, it invents its own ways; in 2001 his quest bore fruit and he invented a concrete that transmits light. We all know that concrete can be concocted to look like many things, but who would have thought that the rock-solid substance could substitute for a window or serve as a partition wall?

Light-transmitting concrete is a widely applicable new building material. It is a combination of optical fibres and fine concrete and can be produced as prefabricated building blocks and panels. Due to their small size, the fibres blend into the concrete and become a component of the material, like small pieces of aggregate. In this manner the result is not merely two materials (glass in concrete) mixed together, but a third, new material, which is homogeneous in its inner structure and on its main surfaces as well. The fibres conduct light point by point between the two sides of the blocks. Because of their parallel arrangement, the light information on the brighter side of such a wall appears unchanged on the darker side; the most striking form of this phenomenon is probably the sharp display of shadows on the opposite side of the wall. Moreover, the colour of the light also remains the same. Thousands of optical fibres are organized into thin layers running parallel to each other between the two main surfaces of each block. The proportion of the fibres is very small (4%) compared to the total volume of the blocks, so the surface of the blocks remains homogeneous concrete. A wall built from light-transmitting concrete can be several metres thick, because the fibres carry light with almost no loss up to 20 metres. Load-bearing structures can also be built of these blocks, since the fibres do not have a negative effect on the high compressive strength of concrete. The blocks can be produced in various colours and sizes.


Transparent Concrete is the first step to what might become the building material of the future.

Technical data:

Product | Transparent Concrete
Form | Prefabricated blocks
Ingredients | 96% concrete, 4% optical fibre
Density | 2100-2400 kg/m³
Maximum block size | 600 x 300 mm
Standard block size | 600 x 300 mm
Thickness | 25-500 mm
Colour | Grey, black or white
Finish | Polished
Compressive strength | 70 N/mm²
Bending tensile strength | 45 N/mm²


Rajiv Gandhi (Bandra-Worli) Sea Link Project


Pramod Kumar
Department of Physics, Jagan Nath University, Jaipur

The Rajiv Gandhi (Bandra-Worli) Sea Link Project has been one of the most highly recommended projects of all the transport studies done for the metropolitan region during the last forty years. This is the first sea bridge to cut travel time between Bandra and Worli. At present, the Mahim Causeway is the only link connecting the western suburbs to the island city of Mumbai; the existing north-south western corridor is highly congested and during peak hours produces a bottleneck at the causeway. The bridge, conceived in the late 1990s, has used a length of steel wire equivalent to the circumference of the earth, and the height of the cable-stayed bridge equals that of a 43-storey building. The project is part of the western freeway sea project to connect Borivalli and the distant western suburbs to Nariman Point in south Mumbai. The eight-lane bridge will help considerably reduce congestion in the Mahim Causeway area, presently the only connection between south Mumbai and the central and western suburbs.

The 5.6 km state-of-the-art cable-stayed bridge providing an alternative route to south Mumbai from the western suburbs became operational ahead of the Maharashtra Assembly elections due at year-end. More than 10 years in the making, the landmark bridge will reduce traffic snarls by providing a free way through the sea, cutting travel time between the two points from the present 60-90 minutes to 6-8 minutes. Sonia Gandhi inaugurated the bridge and dedicated the sea link to the nation, saying that the UPA government aims to develop Mumbai into the best city in the world: "Our objective is to make Mumbai the most elegant city in the world, complete with all amenities," she said, assuring the Centre's complete support in eliminating the problems faced by the city. The Mahim Causeway has become increasingly bottleneck-prone as it clocks around 1.25 lakh vehicles daily; it presently takes nearly one hour to travel the 8-kilometre distance from Mahim to Worli. The project, commissioned by the Maharashtra State Road Development Corporation (MSRDC) and the Maharashtra Government, has been built by Hindustan Construction Company (HCC). The main span of the cable-stayed portion of the Bandra-Worli Sea Link, measuring 500 metres, is the largest in India, superseding the Vidyasagar Setu in Kolkata, and shares the 20th spot with Thailand's Kanchanaphisek Bridge among the longest-span bridges in the world.

Project Location

The project runs from the junction of the Western Express Highway and Swami Vivekanand Road at Bandra and connects to Worli at the Worli end, with an overall length of 5.6 km for the entire project. A cloverleaf interchange at the Mahim intersection and a flyover at the Lovegrove intersection have been proposed as part of this project to enable faster and safer traffic dispersal.


Visible Light Communication: Advent of a New Era in Wireless Technology


Meenu Dave
Department of Computer Science / Information Technology, Jagan Nath University, Jaipur

With the speed at which technological advancements are taking place, the world is turning into a small village in terms of ever faster access to any kind of information required. Thanks to these miraculous inventions, we have swiftly graduated from desktops to laptops, from laptops to netbooks, and from netbooks to smart phones and tablets for communication. Simultaneously, the shift from non-portable to portable and from wired to wireless techniques has been gaining popularity. The next moves in line have been the increasing demand for cloud computing and Wi-Fi hotspots. Although the hardware is quite pleasing, wireless techniques face impediments related to fast access (speed) and security, and installation difficulties in different environments add to the existing obstacles. As a possible solution to these problems, scientists from the Fraunhofer Institute for Telecommunications, Heinrich Hertz Institute (HHI), in Berlin, Germany, have managed to develop a new transfer technology, namely Visible Light Communication (VLC), to transfer data securely and at very high speed. Visible Light Communication uses light-emitting diodes (LEDs) and performs two functions: illumination and data transmission. It makes use of the visible light spectrum, which is free and less crowded than other frequency bands. With minor technical adjustments, VLC can turn regular LEDs into an optical WLAN (Wireless Local Area Network). Quality, quickness and safety are the buzzwords of this technology. Using LED light, data including HD-quality video and audio, internet traffic, etc., can be transmitted at high speed to wireless devices such as smart phones and palm devices.


White-light LEDs, the light sources for VLC, carry out the function of illumination as well as data transfer. A modulator added as an extra component switches the LEDs on and off in quick succession, so data is transferred as a series of ones and zeroes; to the human eye, the modulation of the light is indiscernible. Initial experimental data transfer rates have been between 100 and 800 megabits per second. Compared to radio-frequency wireless communication, VLC offers many benefits. It uses an unlicensed spectrum with free and unregulated wavelengths, a spectrum six orders of magnitude larger than radio frequency. It offers high-gain antennas, high-quality links, and short wavelengths that are harmless to humans and even safe for the human eye. An additional advantage of the technology is that the data transfer cannot be intercepted: a photodetector must be positioned directly within the light cone to receive the data, so the data transported in the light beam cannot be tapped. The technology seems to have only one disadvantage: if an obstacle is placed between the light and the photodiode, the transfer is blocked. Many fields can benefit from visible light communication. In places like hospitals where radio frequencies are not allowed (because they would interfere with other equipment), such an optical system can work effectively, transmitting large amounts of data. In transportation, data could be transmitted to vehicles such as cars, trucks and trains through LED stoplights and railroad signals. Additional bandwidth can be provided to relieve the congested Wi-Fi spectrum. Along with this, it can be highly beneficial for secure data transmission in any industry where the use of Wi-Fi could pose a risk to corporate and organizational security, and it can help enable smart domestic networks with effective wireless communication, including media streaming and internet access. In aviation, manufacturing plants, defence and military applications, oil/gas/petrochemical exploration, mining, and for navigation and tracking on land, inside buildings or under water, it seems to be the best, cheapest and most secure lighting and data transmission mode. Though still undergoing rigorous research, most of the groundwork for commercialization has been carried out; a communications standard for VLC (IEEE 802.15.7) has already been developed. The future seems very bright and promising, and one can expect to see more astounding results by the time this technology is out for commercial use.
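The modulation idea described above, an LED switched on and off faster than the eye can follow, can be sketched in a few lines of Python. The framing below is a toy illustration of on-off keying, not the actual IEEE 802.15.7 modulation scheme.

def to_bits(message):
    """Transmitter side: serialize text into the ones and zeroes the LED emits
    (1 = LED on for one time slot, 0 = LED off)."""
    return [(byte >> i) & 1
            for byte in message.encode("utf-8")
            for i in range(7, -1, -1)]

def from_bits(bits):
    """Receiver side: regroup the photodetector samples into bytes."""
    data = bytes(int("".join(map(str, bits[i:i + 8])), 2)
                 for i in range(0, len(bits), 8))
    return data.decode("utf-8")

bits = to_bits("VLC")
print(bits)             # [0, 1, 0, 1, 0, 1, 1, 0, ...] -- the light pulse train
print(from_bits(bits))  # "VLC" recovered at the receiver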


Study of Charging and Discharging of a Condenser - A Microcomputer Based Laboratory Experiment

Prof Y. S. Shishodia
Jagan Nath University, Jaipur

The charge q on a condenser's plates is proportional to the potential difference across them. This relationship is expressed as V = q/C, where C is a proportionality constant known as the capacitance. C is measured in units of farad, F (1 farad = 1 coulomb/volt). When a voltage source is connected to a condenser through a resistance, charge flows from the source to the condenser and continues to flow until the voltage across the condenser equals the source voltage, at which point the flow of charge stops and the condenser is fully charged. Similarly, when the voltage source is removed and the condenser is allowed to discharge through a resistance, the condenser voltage gradually drops to zero and the condenser is fully discharged. The time dependence of the voltage across the condenser when charging through a resistance R from a voltage source V0 is given by

Vc(t) = V0 (1 - exp(-t/RC))

while during discharge through a resistor R, the time dependence of the voltage across a condenser initially charged to a voltage V0 is given by

Vc(t) = V0 exp(-t/RC)

The rate of charging, and also of discharging, is determined by the product RC, known as the time constant of the circuit. A large time constant means that the condenser will charge and discharge slowly. The charging and discharging of a condenser is a familiar experiment in undergraduate physics and electronics laboratories; however, the actual data-taking process is often cumbersome and time consuming. If the RC time constant is 1 second and the source voltage is 5 volts, the condenser will charge to 3.16 volts in 1 second, to 4.32 volts in the next second and to 4.75 volts in the second after that. Thus in 3 seconds the voltage reaches 95% of the peak value. Similarly, during discharge the voltage falls to 0.25 V in 3 seconds. Manual data taking would require one person to note down the time and another to record the voltage simultaneously, and fairly large values of resistance and capacitance would be needed.

In the present MBL experiment, a regulated 5 volt power supply is used to charge a 50 microfarad condenser through a series resistance of 20 kilo-ohm, giving a circuit time constant of 1 second. Two switches are used: S1 for initiating the charging and S2 for discharging. A Vernier voltage probe is connected across the condenser to measure its voltage; the probe signal is fed through a Vernier Go!Link to the USB port of a computer running the Windows XP operating system. Data acquisition is controlled by the Vernier Logger Pro software, which provides a real-time display of the time-voltage data in tabular as well as graphical form on the computer screen. The sampling rate (adjustable from 100,000 samples per second down to one sample per several hours) was chosen as 10 samples per second. The condenser is shorted to remove any residual charge and the data acquisition is started. After a few seconds, charging is started by closing switch S1. The screen displays the voltage across the condenser on the Y axis and time on the X axis, along with the numeric values of time and Vc(t) in tabular form. Once the condenser voltage reaches its peak value, switch S1 is opened and S2 is closed to discharge the condenser through the resistance. Thus the charging and discharging can be recorded and displayed on one screen. One typical run is shown below.
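The voltages quoted above follow directly from the two formulas. A minimal numerical check in Python, using the component values of the experiment:

import math

V0 = 5.0     # source voltage, volts
R = 20e3     # series resistance, ohms (20 kilo-ohm)
C = 50e-6    # capacitance, farads (50 microfarad)
tau = R * C  # time constant RC = 1.0 second

def v_charge(t):
    """Voltage across the condenser while charging from 0 V."""
    return V0 * (1.0 - math.exp(-t / tau))

def v_discharge(t):
    """Voltage while discharging from an initial charge of V0."""
    return V0 * math.exp(-t / tau)

for t in (1.0, 2.0, 3.0):
    print(f"t = {t:.0f} s: charging {v_charge(t):.2f} V, discharging {v_discharge(t):.2f} V")
# t = 1 s: charging 3.16 V, discharging 1.84 V
# t = 2 s: charging 4.32 V, discharging 0.68 V
# t = 3 s: charging 4.75 V, discharging 0.25 V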


The software also has several data-analysis features. One can move a cursor on the screen to read time and voltage values, and curve fitting is built in. The charging data have been fitted to the curve A(1 - exp(-t/RC)) + B and the discharge data to the curve A exp(-t/RC) + B. The fitted time constant RC is 1.00 second during charging and 0.99 second during discharge. The solid curves are the fitted lines, and their overlap with the experimental data points is very good. The experiment can be modified to connect two condensers in series or in parallel, measure the time constants and verify the law of condensers in series and parallel. One can also record the voltage across the resistance to determine the behaviour of the circuit current during charging and discharging. The students can be given the data to plot the charging and discharging graphs manually, and can practice drawing graphs, linearizing data and computing time constants.
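The fit performed by Logger Pro can be reproduced offline. The sketch below uses SciPy on synthetic noisy data standing in for the exported time-voltage table; the noise level and initial guesses are illustrative assumptions, not values from the experiment.

import numpy as np
from scipy.optimize import curve_fit

def charging(t, A, rc, B):
    """Model fitted to the charging branch: A*(1 - exp(-t/RC)) + B."""
    return A * (1.0 - np.exp(-t / rc)) + B

rng = np.random.default_rng(1)
t = np.arange(0.0, 5.0, 0.1)  # 10 samples per second, as in the experiment
v = charging(t, 5.0, 1.0, 0.0) + rng.normal(0.0, 0.02, t.size)  # synthetic "data"

(A, rc, B), _ = curve_fit(charging, t, v, p0=(4.0, 0.5, 0.0))
print(f"A = {A:.2f} V, RC = {rc:.2f} s, B = {B:.3f} V")  # RC comes out close to 1.00 s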


The Trends and Future of Biotechnology Crops for Disease Control


Dr Vikas Bishnoi and Dr Preeti Sharma Department of Life and Allied Science Jagan Nath University, Jaipur.

INTRODUCTION
There is a need to increase food production considerably in the foreseeable future to meet the food and feed demands of the world, particularly in the developing countries of Asia, Africa and Latin America. This demand has to be met primarily through yield increases on existing cultivated land in order to be environmentally sustainable and cost effective. One way to increase yields is to minimize losses due to pests, which destroy on average 14 to 25% of total global agricultural production. These losses are most significant in food crops, since crop protection is less efficient in food crops than in cash crops. The cost of pesticides, estimated at more than US $10 billion per annum, needs to be added to these figures, and the fact that pesticides often affect non-target organisms and leave harmful residues should also be considered.

GLOBAL STATUS OF BIOTECH CROPS


Biotech crops have been grown commercially since 1996. In 2009, global production reached 134 million (M) ha in 25 countries. The nine industrialized countries contributing to this figure still cultivated a larger area of GM crops than the 16 developing countries, but the gap was closing as more developing-country farmers experienced the benefits of planting biotech crops first hand, making this the fastest adoption of any crop technology in recent years, with a growth rate of approximately 8% per annum. These high adoption levels are due to the economic and environmental benefits experienced by farmers in both industrial and developing countries, most of them growing Bacillus thuringiensis (Bt) cotton, followed by Bt maize.

ADOPTION OF Bt CROPS IN DEVELOPING COUNTRIES


In India, Bt cotton was first planted in 2002 by 54,000 farmers on 50,000 ha. By 2009, 5.6 M small and resource-poor farmers were cultivating it on 8.4 M ha; 90% of the farmers had replanted the crop, and this area represents 87% of all the cotton planted in the country.

BENEFITS FROM Bt CROPS


To date, a large collection of more than 200 Bt proteins showing differing levels of toxicity to selected insects has been identified in various strains of the bacterium B. thuringiensis. Bt proteins have been used as a safe but expensive biopesticide for over 40 years. Unlike synthetic pesticides they are non-toxic to vertebrates, and they are very specific to particular insect pests. This is also the case with Bt transgenic crops. In contrast to Bt technologies, synthetic pesticides often kill non-target pests and their predators in addition to the target pest. Bt crops are also particularly suitable for small-scale farmers, since no equipment or pesticide knowledge is needed for cultivation, and these crops reduce farmers' exposure to insecticides, especially for those using hand sprayers. In this context, the cultivation of Bt maize has substantially reduced yield losses due to rootworms and stem borers without resorting to the more toxic organophosphate insecticides. For maize, it is estimated that Bt varieties can substitute for 40-50% of the insecticides currently in use. For cotton, conventional varieties require 2-30 sprays per season, which are drastically reduced with Bt varieties. This benefits both the environment and labourers' health, especially in developing countries where pesticides are mainly applied with knapsack sprayers, as in China. Another benefit of Bt maize is that it accumulates fewer mycotoxins from the opportunistic fungi that infect damaged kernels: healthier cobs without insect damage are less likely to be infected by fungi, which produce mycotoxins that are harmful, and often lethal, to humans and livestock. Bt crops also increase incomes through higher yields of healthier grain, which is reflected in their continued increasing adoption. This holds true for both smallholder farms and large farms. In addition, there has been no documented proof of any negative impact on non-target insects in Bt fields.

CURRENT STATUS OF Bt TECHNOLOGY


B. thuringiensis (Bt) is a soil bacterium that produces a diverse group of insecticidal protein toxins with narrow specificity towards different insects. These toxins, called Crystal (Cry) and Cytolytic (Cyt) proteins, are accumulated in crystalline inclusion bodies in the bacteria. Another class of toxins from these bacteria is expressed during bacterial growth and is known as the vegetative insecticidal proteins. Cry proteins are pro-toxins that are activated by host proteases in the insect gut. They have been extensively studied and consist of three domains. Of these, domains II and III determine the insect specificity and interact with specific receptors located on the insect mid-gut surface, which leads to oligomerization of the toxin molecules into a pre-pore structure that can insert into the host membrane. Domain I then facilitates insertion into the target membrane to form a transmembrane pore. Once the pore has formed, ionic leakage destroys the cells and kills the insect. Generally, the toxin needs to be expressed at concentrations of more than 0.2% of total soluble protein in the appropriate tissue of a transgenic plant to be effective. The first generation of commercialized insect-resistant crops expressed single Bt Cry genes, which poses a relatively high risk that insects will evolve resistance to the toxin. In the second and third generations, scientists have mitigated this risk by stacking or pyramiding different genes, such as multiple but different Cry genes, or Cry genes combined with other insecticidal proteins, which target different receptors in insect pests and also provide resistance to a wider range of pests. Alternatively, synthetic variants of Cry genes have been employed, as in the case of MON863, which expresses a synthetic Bt kumamotoensis Cry3Bb1 gene against corn rootworm that is eight times more effective than the native, non-modified version. Multiple mutations/adaptations therefore need to be made by target pests in order to develop resistance to this robust new generation of insect-resistant crops.

FUTURE OF GM PEST CONTROL


Constitutive expression of Bt genes has been very successful, but in some cases tissue-specific expression is a better option, for example in epidermal cells, which first come under attack from insects, or in the phloem for sap-sucking insects. It has been reported that expression can be regulated using transcription factors or chemical induction; with this technique it is possible to create within-plant refuges, where parts of the plant do not express the genes and act as a non-GM refuge. Plastid expression, such as in chloroplasts, is also an important target for future Bt crops. Higher levels of toxin, up to 3-5% of total leaf protein, accumulate in chloroplasts, since the plastid genome is bacterial in origin, as are Bt genes. Since cytoplasmic plastids are predominantly maternally inherited, this will also reduce the chances of gene flow through pollen.


APPLICATIONS OF PHYSICS IN SCIENCE AND ENGINEERING

Sushila
Department of Physics, Jagan Nath University, Jaipur


Unlike traditional engineering disciplines, engineering science/physics is not necessarily confined to a particular branch of science or physics. Instead, engineering science/physics is meant to provide a more thorough grounding in applied physics for a selected specialty such as optics, quantum physics, materials science, applied mechanics, nanotechnology, microfabrication, mechanical engineering, electrical engineering, biophysics, control theory, aerodynamics, energy, solid-state physics, etc. It is the discipline devoted to creating and optimizing engineering solutions through enhanced understanding and integrated application of mathematical, scientific, statistical, and engineering principles. The discipline is also meant for cross-functionality, and bridges the gap between theoretical science and practical engineering with emphasis on research and development, design, and analysis. Engineering physics and engineering science degrees are respected academic degrees awarded in many countries. It is notable that in many languages the term for "engineering physics" would be directly translated into English as "technical physics". In some countries, both what would be translated as "engineering physics" and what would be translated as "technical physics" are disciplines leading to academic degrees, with the former specializing in nuclear power research and the latter closer to engineering physics. In some institutions, an engineering (or applied) physics major is a discipline or specialization within the scope of engineering science or applied science. In many universities, engineering science programs may be offered at the levels of B.Tech, B.Sc., M.Sc. and Ph.D. Usually, a core of basic and advanced courses in mathematics, physics, chemistry, and biology forms the foundation of the curriculum, while typical elective areas may include fluid dynamics, quantum physics, economics, plasma physics, relativity, solid mechanics, operations research, information technology and engineering, dynamical systems, bioengineering, environmental engineering, computational engineering, engineering mathematics and statistics, solid-state devices, materials science, electromagnetism, nanoscience, nanotechnology, energy, and optics. While typical undergraduate engineering programs generally focus on the application of established methods to the design and analysis of engineering solutions, an undergraduate program in engineering science focuses on the creation and use of more advanced experimental or computational techniques where standard approaches are inadequate (i.e., the development of engineering solutions to contemporary problems in the physical and life sciences by applying fundamental principles). Due to the rigorous nature of the academic curriculum, an undergraduate major in engineering science is an honors program at some universities, such as the University of Toronto and Pennsylvania State University.


BRIEF HISTORY OF SPECIAL FUNCTIONS AND THEIR APPLICATIONS IN SCIENCE AND ENGINEERING

Jagdev Singh
Department of Mathematics, Jagan Nath University, Jaipur

Special functions have been around for centuries. No one can imagine mathematics without the Gaussian and confluent hypergeometric functions, the associated Legendre and Laguerre polynomials, Bessel functions and many more. The advent of fast computing machines was thought to have made special functions a subject of the past, but the continued development of older functions and the introduction of new special functions have kept the subject important for future development. Because of their importance, several books and a large collection of papers are devoted to these functions. On several occasions, the solution of enumeration problems involving combinatorial objects requires knowledge from special function theory. Earlier the emphasis was on special functions satisfying linear differential equations, but this has now been extended to difference equations, partial differential equations and nonlinear differential equations.

Special functions originated in England in the seventeenth century. The Oxford professor John Wallis, in 1656, developed some theory of the Gamma function, produced an infinite product formula for pi, and introduced elliptic integrals. In 1703, James Bernoulli solved a differential equation by an infinite series which is now called the series representation of the Bessel functions. In the nineteenth century, special functions were widely developed in Germany and France, but the main achievement of that century was the introduction of the hypergeometric series by Gauss in 1812. Then Clausen in 1828 and Kummer in 1836 studied the series 3F2 and 1F1 respectively. In 1880, Appell introduced hypergeometric functions of two variables, and Lauricella generalized them to several variables in 1893.

In the twentieth century, special functions were used in the eigenvalue problems of quantum mechanics. In 1907, Barnes used the Gamma function to develop a new theory of the Gauss hypergeometric function 2F1. Various generalizations of 2F1 were then introduced by Horn, Kampé de Fériet, MacRobert and Meijer. In 1903, the Swedish mathematician Gösta Mittag-Leffler introduced the Mittag-Leffler function. The importance of the Mittag-Leffler function has been realized during the last two decades due to its involvement in problems of physics, chemistry, biology, engineering and the applied sciences. The Mittag-Leffler function arises naturally in the solution of fractional-order integral and differential equations, and especially in investigations of the fractional generalization of the kinetic equation, random walks, Lévy flights and super-diffusive transport, and in the study of complex systems. In 1961, Charles Fox introduced a more general function, well known in the literature as the Fox H-function or simply the H-function. Furthermore, in 1987, Inayat-Hussain introduced the H-bar function, a new generalization of the familiar H-function of Fox.

The best-known application areas of special functions are physics, engineering, chemistry, computer science and statistics. Special functions are used in almost all areas of statistics. Statistical densities are basically elementary special functions or products of such functions; hence the theory of special functions is directly applicable to statistical distribution theory. In studying generalized densities, structural properties of densities, Bayesian inference, distributions of test statistics, characterization of densities and related topics in probability theory, stochastic processes and time series, special functions and generalized special functions in the categories of Meijer's G-functions and H-functions arise naturally. A number of statistical distributions involving special functions have been studied from time to time by several research workers; a few to mention are the McKay distribution involving the K-function and the Bessel function distribution involving the I-function.
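For reference, the standard definitions behind the symbols above (not written out in the article) are, in LaTeX notation, the Gauss hypergeometric series

{}_2F_1(a,b;c;z) \;=\; \sum_{n=0}^{\infty} \frac{(a)_n (b)_n}{(c)_n}\,\frac{z^n}{n!}, \qquad |z| < 1,

where (a)_n = a(a+1)\cdots(a+n-1) is the Pochhammer symbol, and the one-parameter Mittag-Leffler function

E_\alpha(z) \;=\; \sum_{n=0}^{\infty} \frac{z^n}{\Gamma(\alpha n + 1)}, \qquad \alpha > 0.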

The statistical distributions involving Bessel functions, Meijer's G-functions and the H-function have been widely studied due to their enormous practical applications in different fields. In recent years this useful branch has been made accessible to scientists and engineers by the interesting research work of a number of people. In 2010, Chaurasia and Singh obtained the distribution of the mixed sum of two independent random variables with different probability density functions, one defined on a finite range and the other on an infinite range, associated with a product of a general class of polynomials and the H-function; the method used is based on the Laplace transform and its inverse. The distribution of sums of random variables is of great importance in many areas of physics and engineering. For example, sums of independent gamma random variables have applications in problems of queuing theory, such as determining the total waiting time, and in civil engineering, such as determining the total excess water flow in a dam. They also appear in obtaining the inter-arrival time of drought events, which is the sum of the drought duration and the successive non-drought duration.

The distribution of quotients of I-function random variables is of interest in many areas of physics and engineering. Wireless communication systems are the major modern communications technology that has shaped human lifestyles over the last several years. With the convergence of technology, many diverse devices such as radios, camcorders, digital cameras and even televisions are combined with wireless equipment. Frequency and data management are decisively concerned, and the quality and capacity of the channels are the most significant predicament. Most coded and uncoded digital communication systems are analyzed and designed under the ideal free-space-propagation Gaussian noise assumption. For wireless communication, however, multipath fading, scattering and shadowing can significantly reduce the performance of the system compared to the ideal Gaussian assumption. Under various outdoor/indoor narrowband flat-fading scenarios, a variety of statistical models of envelope distributions, such as Rayleigh, Rician, exponential, Nakagami-m, Weibull, lognormal and K, have been proposed by various authors. However, almost all these distributions have been proposed purely from empirical fitting of measured data to a statistical distribution, with neither analytical nor physical justification. Recently, Yao et al. gave a systematic and unified approach to show that spherically-invariant random processes can be used to model fading channel statistics. In 2010, Chaurasia and Kumar obtained the distribution of the quotient of two independent I-function random variables and showed that I-function random variables can be used to model wireless communication fading statistics based on spherically-invariant random processes.


IRIS Recognition Algorithms


Gajendra Shrimal
Department of Computer Science / Information Technology Jagan Nath University, Jaipur

A biometric system provides automatic recognition of an individual based on some sort of unique feature or characteristic possessed by the individual. Biometric systems have been developed based on fingerprints, facial features, voice, hand geometry, handwriting, the retina and the IRIS.

Biometric identification is an emerging technology which has gained increasing attention in recent years. It employs physiological or behavioural characteristics to identify an individual. Physiological characteristics include the iris, fingerprint, face and hand geometry; voice, signature and keystroke dynamics are classified as behavioural characteristics. The three main stages of an iris recognition system are image pre-processing, feature extraction and template matching. The iris image needs to be pre-processed to obtain the useful iris region; image pre-processing is divided into three steps: iris localization, iris normalization and image enhancement. The iris is an externally visible yet protected organ whose unique epigenetic pattern remains stable throughout adult life. These characteristics make it very attractive for use as a biometric for identifying individuals. Evidence that the structure of the iris is unique to an individual and stable with age comes from two main sources. The first source of evidence is clinical observations. In the course of examining large numbers of eyes, ophthalmologists and anatomists have noted that the detailed pattern of an iris, even between the left and right irises of a single person, seems to be highly distinctive; further, in cases with repeated observations, the patterns seem to vary little, at least past childhood. The second source of evidence is developmental biology. There, one finds that while the general structure of the iris is genetically determined, the particulars of its minutiae depend critically on circumstances, e.g., the initial conditions in the embryonic precursor to the iris. Therefore they are highly unlikely to be replicated by the natural course of events. Rarely, the developmental process goes awry, yielding only a rudimentary iris or a marked displacement or shape distortion of the pupil. Developmental evidence also bears on the issue of stability with age. Of particular significance for the purposes of recognition is the fact that pigmentation patterning continues until adolescence, and the average pupil size (for an individual) increases slightly until adolescence. Following adolescence, the healthy iris varies little for the rest of a person's life, although slight depigmentation and shrinking of the average pupillary opening are standard with advanced age. Various diseases of the eye can drastically alter the appearance of the iris, and it also appears that intensive exposure to certain environmental contaminants, e.g., metals, can alter iris pigmentation. However, these conditions are rare. On the whole, these lines of evidence suggest that the iris is highly distinctive and stable. From one year of age until death, the patterns of the iris are relatively constant over a person's lifetime. Because of this uniqueness and stability, iris recognition is a reliable human identification technique. Issues in the design and implementation of a system for automated iris recognition can be subdivided into four parts, as shown in the figure. The first stage is image acquisition. The second stage is concerned with localizing the iris per se within a captured image. The third stage is concerned with normalizing the localized iris. The fourth part is concerned with matching an extracted iris pattern against candidate database entries.
Iris Image Capture -> Image Pre-processing -> Feature Extraction -> Template Matching -> Authentic/Imposter

Schematic diagram of iris recognition

Advantages - The iris of the eye has been described as the ideal part of the human body for biometric identification for several reasons:
1. It is an internal organ that is well protected against damage and wear by a highly transparent and sensitive membrane (the cornea). This distinguishes it from fingerprints, which can be difficult to recognize after years of certain types of manual labour.
2. Iris patterns possess a high degree of randomness: variability of 244 degrees of freedom, entropy of 3.2 bits per square millimetre, and uniqueness set by combinatorial complexity.
3. An iris scan is similar to taking a photograph and can be performed from about 10 cm to a few metres away. There is no need for the person being identified to touch any equipment that has recently been touched by a stranger, thereby eliminating an objection that has been raised in some cultures against fingerprint scanners, where a finger has to touch a surface, or retinal scanning, where the eye must be brought very close to a lens (like looking into a microscope).
4. Limited genetic penetrance of iris patterns makes the iris unique among biometrics.
5. Some argue that a focused digital photograph with an iris diameter of about 200 pixels contains much more long-term stable information than a fingerprint.
6. While there are some medical and surgical procedures that can affect the colour and overall shape of the iris, the fine texture remains remarkably stable over many decades; some iris identifications have succeeded over a period of about 30 years.


7. As with other identification infrastructure (national resident databases, ID cards, etc.), civil rights activists have voiced concerns that iris-recognition technology might help governments to track individuals beyond their will.

Disadvantages - The disadvantages are as follows:
1. Contact lenses are available which can change the colour of an individual's iris. These present a problem to any iris recognition system: since a fake iris pattern is printed on the surface of the lens, the system will falsely reject an enrolled user, or falsely accept them if the fake iris pattern has been enrolled in the database. Another, though minor, problem is that the border of any contact lens is slightly visible in an eye image, and this circular border may confuse the automatic segmentation algorithm into detecting it as the iris boundary.
2. Spectacles can introduce too much specular reflection, resulting in failure of automatic segmentation and/or recognition.
3. Poor illumination of the eye can prevent accurate segmentation.
4. The iris recognition process is easily obscured by eyelashes and eyelids due to continuous blinking of the eye.
5. Acquisition of an iris image requires more training and attentiveness than other biometrics.

The most prominent algorithms in each iris recognition stage, and their pros and cons, are discussed in the following sections. In a secure working environment, it is important that the system authenticate the user who is connecting to and using the system. Authentication means that users are able to prove that they are who they say they are when they use the system. There are three basic ways to authenticate a user: using something the user knows, such as a password or a piece of personal information; something the user has, such as a token or secure ID; or something the user is, i.e. a biometric. Each of these three methods of authentication has its own vulnerabilities. Iris recognition has become a popular research area in recent years; due to its reliability and nearly perfect recognition rates, it is used in high-security areas. This article provides a review of major iris recognition research. An iris recognition system includes image acquisition, localization, normalization, and iris pattern recognition.
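As an illustration of the template matching stage, the sketch below compares two binary iris codes by their masked Hamming distance, the measure used in Daugman-style systems. The code length, shift range and decision threshold here are illustrative assumptions, not values from this article.

import numpy as np

def hamming_distance(code_a, code_b, mask_a, mask_b):
    """Fraction of mutually usable bits that differ between two iris codes;
    bits occluded by eyelids or eyelashes are excluded via the masks."""
    usable = mask_a & mask_b
    differing = (code_a ^ code_b) & usable
    n = usable.sum()
    return differing.sum() / n if n else 1.0

def match(code_a, code_b, mask_a, mask_b, max_shift=8, threshold=0.32):
    """Try several circular bit shifts to tolerate head tilt; keep the best."""
    best = min(
        hamming_distance(np.roll(code_a, s), code_b, np.roll(mask_a, s), mask_b)
        for s in range(-max_shift, max_shift + 1)
    )
    return best, best < threshold  # below threshold -> authentic, else imposter

rng = np.random.default_rng(0)
code = rng.integers(0, 2, 2048, dtype=np.uint8)   # toy 2048-bit iris code
other = rng.integers(0, 2, 2048, dtype=np.uint8)  # code from a different eye
mask = np.ones(2048, dtype=np.uint8)
print(match(code, code, mask, mask))   # same eye: distance 0.0 -> authentic
print(match(code, other, mask, mask))  # different eyes: distance near 0.5 -> imposter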


3D Sensors to Allow Interaction with Computers without Touch


Kavita Choudhary Department of Computer Science / Information Technology Jagan Nath University, Jaipur

Before turning to the new, emerging 3D sensor technology, it is worth recalling that image sensors are computer chips, incorporated in digital cameras, that are sensitive to light. Common sensors see and record 2D images, unlike the new generation of sensors, which are able to see in 3D: besides recording the picture, these sensors also record the distance of the scene from the camera. Engineers believe that their latest invention could be used in various ways; for example, it could track movements through 3D space and capture scenes as 3D objects. 3D sensors will allow people to interact with many devices in a much more natural way: for example, simply by looking at a screen and moving your hands you could have dramatic control over a gaming environment.

Video games are the most likely arena in which this latest invention could be widely used. The software giant Microsoft has already made a step towards the no-touch interface with its Project Natal. Canesta is a company that in May presented a somewhat similar technology, allowing users to change channels and adjust the volume of their TV by simply waving their hands. The vision is that if a standard webcam is replaced with a three-dimensional sensor, users will be able to control their computers by moving their hands in front of them. In addition, the latest invention could be used in smart phones, transforming touch-screen technology into a no-touch one. 3D sensors outside and inside cars could replace the ultrasonic sensors that some carmakers place in rear bumpers to warn drivers when they are close to hitting an object or a person. The sensors could also be placed inside a vehicle, replacing weight detectors in determining whether a child is in a seat and whether an air bag should deploy. Besides, the sensors could detect when someone is in the vehicle who is not supposed to be.


BIOPLASTIC: A Better Alternative for a Sustainable Future


Dr Preeti Sharma and Dr Vikas Bishnoi
Department of Life and Allied Science, Jagan Nath University, Jaipur

Conventional plastics, formed from fossil fuels, are among the most important materials for society, but they are created by a process that is harmful to the environment. In the search for alternatives, a new class of materials known as bioplastics has been developed. Bioplastics are long chains of monomers joined to each other by ester bonds; these plastics are thus considered polyesters. Bioplastics are classified into various types. The most common is PHA (polyhydroxyalkanoate), which serves as a carbon and/or energy storage material in various microorganisms under conditions of nutrient deficiency. There are a variety of bioplastic applications in society and industry. This review article is intended to provide information about alternatives to conventional plastics for the betterment of the environment.

INTRODUCTION
Plastics are consumed almost everywhere: in routine household packaging material, in bottles, cell phones, printers, etc. They are also utilized by manufacturing industries ranging from pharmaceuticals to automobiles. They are useful as synthetic polymers because their structures can be chemically manipulated into a variety of strengths and shapes to obtain substances of high molecular weight, low reactivity and long durability. Plastics are important materials for society not only because of their high molecular weight and low reactivity but also for their durability and cost efficiency. Unfortunately, these petroleum-based plastics are not biodegradable. This makes them one of the major causes of solid-waste pollution when buried in landfills. They are indigestible, and in many cases animals die due to plastic blockage of the gut. Furthermore, plastics are often soiled by food and other biological substances, making physical recycling of the material undesirable. Incinerating plastics has been one option, but besides being expensive it is also dangerous: various harmful chemicals such as hydrogen chloride and hydrogen cyanide are released during incineration. In recent years there has been increasing public concern over the harmful effects of petrochemical-derived plastic materials in the environment. The problem of managing plastic waste on earth is growing very rapidly nowadays, and studies have been initiated to find suitable eco-friendly materials to minimize the environmental problem. To find alternatives, researchers have developed fully biodegradable plastics, which when disposed of in the environment can easily degrade through the enzymatic actions of microorganisms. The degradation of biodegradable plastic produces carbon dioxide, methane, water, biomass, humic matter and various other natural substances which can be readily eliminated. Due to their ability to degrade in the biotic environment, these materials are termed bioplastics.

BIOPLASTICS
Bioplastics are a special type of biological material, degradable and eco-friendly in chemical nature. They are polyesters produced by a range of microorganisms cultured under different nutrient and environmental conditions. Bioplastics are mainly classified into three types: photodegradable plastics, semi-biodegradable plastics and fully biodegradable plastics. Photodegradable plastics have light-sensitive groups incorporated directly into the backbone of the polymer as additives; because of the lack of sunlight in landfills, they remain non-degraded and are not widely used. Semi-biodegradable plastics incorporate starch, which bacteria attack easily, but the polyethylene fragments released after the starch is consumed remain non-degradable. The third type of biodegradable plastic is rather new and promising, because it is actually produced and utilized by bacteria to form a biopolymer. These polymers, usually lipid in nature, are accumulated as storage material (in the form of mobile, amorphous, liquid granules) in microbes and allow microbial survival under stress conditions. This storage material is known as polyhydroxyalkanoates (PHAs), which store carbon and energy when nutrient supplies are imbalanced. These polyesters, known as bioplastics, contain long chains of monomers joined to each other by ester bonds. Bioplastics are accumulated when bacterial growth is limited by depletion of nitrogen, phosphorus or oxygen and an excess carbon source is provided. A variety of materials can be utilized as the carbon source for the production of bioplastics.

BIOPLASTIC AND SOCIAL BENEFITS


What makes bioplastics especially important is that the price of petroleum is increasing tremendously and its stocks will be exhausted in the near future. It is important for the global community to have alternatives to products derived from petroleum, such as plastics. PHAs will be at least a partial solution for the many industries and sections of society that largely depend on materials made from plastic. No new invention escapes limitations and drawbacks, and bioplastics too have some. The most important drawback of PHA production is its cost, but the good news is that the price of PHA production is decreasing while the petroleum price is increasing constantly; as a result, the gap between petroleum and PHA prices is becoming very narrow. The first potential applications of PHA polymers were recognized in the 1960s. PHA patents cover a wide range of products such as coatings and packaging, bottles, cosmetic containers, golf tees, and pens. PHAs have also been processed into fibers for nonwoven fabric materials. PHAs can be used for all sorts of biodegradable packaging materials, including composting bags, food packaging, sanitary articles such as diapers, fishing nets and biodegradable rubbers. PHAs are also used to develop scaffolds for tissue engineering, and they possess numerous applications in pharmacy and medical science.

Bioplastics have evolved into an innovative area of research for scientists around the world. This progressive development has been driven by the need for environmentally friendly substitutes for materials derived from fossil fuel sources. In addition, recent high prices for crude oil and the potential market for agricultural materials in bioplastics are driving an economic push toward expanding the bioplastic industry and providing a better alternative for the sustainable development of the future environment.

Examples of carbon sources suitable for biotechnological production of PHAs:

Carbon source(s) | Bacterial strain(s) | Polymer produced
Glucose, sugarbeet molasses | Bacillus cereus | PHB, terpolymer
Glucose, technical oleic acid, waste free fatty acids, waste frying oil | Pseudomonas aeruginosa | mcl-PHAs
Glucose, octanoic acid, undecenoic acid | Pseudomonas putida | mcl-PHAs
Glucose, soybean oil, alcohols, alkanoates | Pseudomonas stutzeri | mcl-PHAs
Palm olein, palm stearin, crude palm oil, palm kernel oil, oleic acid, xylose, levulinic acid, sugarbeet molasses | Burkholderia cepacia | PHB, PHBV
Malt, soy waste, milk waste, vinegar waste, oil | Staphylococcus epidermidis | PHB
Starch hydrolysate, maltose, maltotetraose and maltohexaose | Halomonas boliviensis | PHB

mcl-PHAs: medium-chain-length polyhydroxyalkanoates; PHB: poly(3-hydroxybutyrate); PHBV: poly(3-hydroxybutyrate-co-valerate)


Benefit of FACTS Controllers in AC Transmission Systems Vishal Sharma Department of Electrical Engineering, Jagan Nath University, Jaipur
INTRODUCTION
With the ongoing expansion and growth of the electric utility industry, including deregulation in many countries, numerous changes are continuously being introduced to a once predictable business. Although electricity is a highly engineered product, it is increasingly being considered and handled as a commodity. Thus, transmission systems are being pushed closer to their stability and thermal limits while the focus on the quality of power delivered is greater than ever. In the evolving utility environment, financial and market forces demand, and will continue to demand, a more optimal and profitable operation of the power system with respect to generation, transmission, and distribution. Now, more than ever, advanced technologies are paramount for the reliable and secure operation of power systems.

To achieve both operational reliability and financial profitability, it has become clear that more efficient utilization and control of the existing transmission system infrastructure is required. Improved utilization of the existing power system is provided through the application of advanced control technologies. Power electronics based equipment, or Flexible AC Transmission Systems (FACTS), provides proven technical solutions to address the new operating challenges being presented today. FACTS technologies allow for improved transmission system operation with minimal infrastructure investment, environmental impact, and implementation time compared to the construction of new transmission lines. Traditional solutions to upgrading the electrical transmission system infrastructure have been primarily in the form of new transmission lines, substations, and associated equipment. However, as experience has proven over the past decade or more, the process to permit, site, and construct new transmission lines has become extremely difficult, expensive, time-consuming, and controversial. FACTS technologies therefore provide advanced solutions as cost-effective alternatives to new transmission line construction.

The potential benefits of FACTS equipment are now widely recognized by the power systems engineering and T&D communities. With respect to FACTS equipment, voltage sourced converter (VSC) technology, which utilizes self-commutated thyristors/transistors such as GTOs, GCTs, IGCTs, and IGBTs, has been successfully applied in a number of installations world-wide for Static Synchronous Compensators (STATCOM), Unified Power Flow Controllers (UPFC), Convertible Series Compensators (CSC), back-to-back dc ties (VSC-BTB) and VSC transmission. In addition to these referenced and other applications, there are several recently completed STATCOMs in the U.S., in the states of Vermont, California and Texas [no references available]. In addition, there are newly planned STATCOMs in Connecticut and Texas, as well as a small STATCOM (D-VAR) planned for BC Hydro and several other locations.

A. Generation, Transmission, Distribution
When discussing the creation, movement, and utilization of electrical power, the subject can be separated into three areas, which traditionally determined the way in which electric utility companies were organized. These are illustrated in Figure 1.

JNU STEM Bulletin January 2012

Page 31

Although power electronics based equipment is prevalent in each of these three areas, such as static excitation systems for generators and Custom Power equipment in distribution systems, the focus of this paper and the accompanying presentation is on transmission: moving the power from where it is generated to where it is utilized.

B. Power System Constraints
As noted in the introduction, transmission systems are being pushed closer to their stability and thermal limits while the focus on the quality of power delivered is greater than ever. The limitations of the transmission system can take many forms and may involve power transfer between areas (referred to here as transmission bottlenecks) or within a single area or region (referred to here as a regional constraint), and may include one or more of the following characteristics:
- Steady-State Power Transfer Limit
- Voltage Stability Limit
- Dynamic Voltage Limit
- Transient Stability Limit
- Power System Oscillation Damping Limit
- Inadvertent Loop Flow Limit
- Thermal Limit
- Short-Circuit Current Limit
- Others

Each transmission bottleneck or regional constraint may have one or more of these system-level problems. The key to solving these problems in the most cost-effective and coordinated manner is thorough systems engineering analysis, as described later in this paper.

C. Controllability of Power Systems
To illustrate that the power system has only certain variables that can be impacted by control, consider the basic and well-known power-angle curve. Although this is a steady-state curve and the implementation of FACTS is primarily for dynamic issues, it demonstrates that there are primarily three main variables that can be directly controlled in the power system to impact its performance. These are:
- Voltage
- Angle
- Impedance

One could also make the point that direct control of power is a fourth variable of controllability in power systems.
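To make the controllability discussion concrete, here is a minimal sketch in Python of the textbook lossless-line power-angle relation P = V1*V2*sin(delta)/X; the per-unit values are illustrative assumptions, not data from any installation mentioned in this article.

import math

def power_transfer(v1, v2, x, delta_deg):
    # Steady-state real power transfer (per unit) across a lossless line:
    # P = V1 * V2 * sin(delta) / X
    return v1 * v2 * math.sin(math.radians(delta_deg)) / x

# Nominal case: 1.0 pu voltages, 0.5 pu series reactance, 30 degree angle.
p_base = power_transfer(1.0, 1.0, 0.5, 30.0)

# Each of the three controllable variables shifts the curve:
p_voltage = power_transfer(1.05, 1.05, 0.5, 30.0)       # voltage support (e.g., shunt compensation)
p_angle   = power_transfer(1.0, 1.0, 0.5, 40.0)         # larger transmission angle (e.g., phase shifter)
p_series  = power_transfer(1.0, 1.0, 0.5 - 0.15, 30.0)  # series compensation lowers effective X

print(f"base {p_base:.3f} pu, voltage {p_voltage:.3f} pu, "
      f"angle {p_angle:.3f} pu, series {p_series:.3f} pu")

Raising the voltage, increasing the transmission angle, or reducing the effective series reactance each raises the transferred power; these are exactly the levers that the conventional equipment and FACTS controllers listed below act upon.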

Figure 2. Illustration of controllability of power systems

D. Examples of Conventional Equipment for Enhancing Power System Control
- Series Capacitor (controls impedance)
- Switched Shunt Capacitor and Reactor (controls voltage)
- Transformer LTC (controls voltage)
- Phase Shifting Transformer (controls angle)
- Synchronous Condenser (controls voltage)
- Special Stability Controls (typically focused on voltage control but can include direct control of power)

E. Examples of FACTS Controllers for Enhancing Power System Control

- Static Synchronous Compensator (STATCOM)
- Static Var Compensator (SVC)
- Unified Power Flow Controller (UPFC)
- Convertible Series Compensator (CSC)
- Inter-phase Power Flow Controller (IPFC)
- Static Synchronous Series Compensator (SSSC)
- Thyristor Controlled Series Compensator (TCSC)
- Thyristor Controlled Phase Shifting Transformer (TCPST)
- Superconducting Magnetic Energy Storage (SMES)

F. Benefits of Control of Power Systems
Once power system constraints are identified, and viable solution options are identified through system studies, the benefits of the added power system control must be determined. The following offers a list of such benefits:
- Increased Loading and More Effective Use of Transmission Corridors
- Added Power Flow Control
- Improved Power System Stability
- Increased System Security
- Increased System Reliability

The advantages in this list are important to achieve in the overall planning and operation of power systems. However, for justifying the costs of implementing added power system control and for comparing conventional solutions to FACTS controllers, more specific metrics of the benefits to the power system are often required. Such benefits can usually be tied back to an area or region for a specific season and year at a defined dispatch (usually given by an ISO or equivalent) while meeting criteria such as the following:

- Voltage Stability Criteria: P-V voltage or power criteria with minimum margins; Q-V reactive power criteria with minimum margins
- Dynamic Voltage Criteria: e.g., avoiding voltage collapse; minimum transient voltage dip/sag criteria (magnitude and duration)
- Transient Stability Criteria
- Power System Oscillation Damping: e.g., a minimum damping ratio
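As a hedged illustration of the last criterion, here is a minimal sketch of how the damping ratio of an oscillatory mode is computed from a small-signal eigenvalue sigma + j*omega; the mode values and the 3% threshold are illustrative assumptions, not criteria taken from any particular ISO or study.

import math

def damping_ratio(sigma, omega):
    # Damping ratio of a mode with eigenvalue sigma + j*omega
    # (sigma in 1/s, omega in rad/s): zeta = -sigma / sqrt(sigma^2 + omega^2)
    return -sigma / math.hypot(sigma, omega)

# Hypothetical inter-area mode: a 0.5 Hz oscillation decaying at 0.10 1/s.
sigma, omega = -0.10, 2.0 * math.pi * 0.5
zeta = damping_ratio(sigma, omega)
print(f"eigenvalue {sigma:+.2f} {omega:+.2f}j -> damping ratio {zeta:.3f}")

# A planning criterion might then require, say, zeta >= 0.03 for all modes.
print("meets a 3% damping criterion:", zeta >= 0.03)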

QUALITY MANAGEMENT SYSTEM Vipin Goyal Department of Mechanical Engineering, Jagan Nath University, Jaipur
India as an industrializing country is making rapid strides in the industrial sector, with a very high growth rate of 10-12% per annum. Along with this come new opportunities, such as the expansion of the Metro Rail network, which brings large industries from around the world, such as Bombardier, to India and requires setting up a huge supply chain base in the country to manufacture metro coaches, which are very technology specific. Similarly, the aerospace industry and its supply chain are also bound to grow. This is apart from the fact that the automobile sector is going to make India a world hub for small cars and their parts. These developments have a cascading effect on the industrial scenario, with a large vendor base, service support, consulting firms and so on, all of which lead to a very high requirement for competent manpower in this sector.

There is a huge gap between what is required by industry and what is provided in the curriculum of the science and technology (engineering) streams in Rajasthan. As a result, students of these streams have not been able to avail themselves of the opportunities arising from the rapid expansion of the industrial sector, particularly light and medium engineering, automobiles and the service industry, because they are less than prepared to meet such requirements and challenges. Similarly, the faculty of these streams are often unaware of such subjects because of a lack of exposure to the industrial scenario; this in turn leads to a low level of industrialization in the state. In the face of such requirements, students of the science and technology streams, particularly those from a mechanical engineering background, should prepare themselves to understand industry requirements and the tools which go along with them.

Quality Terminology
Definitions of the important terms used in industry are obtained from the documents of professional societies and organizations such as the American Society for Quality (ASQ) and the International Organization for Standardization (ISO). These terms are used in almost all of the top corporations in the world, and an understanding of them is to be developed in the sessions. One such term is QFD: Quality Function Deployment (QFD) is a structured approach to defining customer needs or requirements and translating them into specific plans to produce products that meet those needs. Other terms are Quality, Process, Benchmarking, JIT, Kanban, Kaizen, DOE, Concurrent Engineering, ERP, etc.

ISO 9001 / TS 16949
Per ISO 9001:2008, a quality management system (QMS) is a set of policies, processes and procedures required for planning and execution (production/development/service) in the core business area of an organization, i.e. areas that can impact the organization's

ability to meet customer requirements. The QMS refers to the entire system; the documents merely describe it. ISO/TS 16949 is an international standard aimed at the development of a quality management system that provides for continual improvement, emphasizing defect prevention and the reduction of variation and waste in the supply chain.

Total Quality Management
Total quality management (TQM) is a management philosophy that seeks to integrate all organizational functions (marketing, finance, design, engineering, production, customer service, etc.) to focus on meeting customer needs and other organizational objectives. TQM empowers the entire organization, from the most junior employee to the CEO, with the responsibility of ensuring quality in its processes. A large number of organizations have deployed TQM; it is a philosophy appropriate to any situation in which quality assurance is important.

Lean Management
Lean manufacturing or lean production, often known simply as "Lean", is a production practice that considers the expenditure of resources for any goal other than the creation of value for the end customer to be wasteful, and thus a target for elimination. Lean manufacturing is a generic process management philosophy derived mostly from the Toyota Production System (TPS), hence the term Toyotism is also prevalent. It is renowned for its focus on reducing the original Toyota seven wastes to improve overall customer value. Lean manufacturing is a variation on the theme of efficiency based on optimizing flow; it is a present-day instance of the recurring theme in human history toward increasing efficiency, decreasing waste, and using empirical methods to decide what matters, rather than uncritically accepting pre-existing ideas.

Six Sigma
Six Sigma is a systematic process of quality improvement through a disciplined data-analysis approach, improving organizational processes by eliminating the defects or obstacles that prevent organizations from reaching perfection. Sigma is the letter of the Greek alphabet used to denote the standard deviation of a process, and the sigma quality level is sometimes used to describe the output of a process. Six Sigma now brings to bear a whole culture of strategies, tools and statistical methodologies to improve the bottom line of companies and achieve quantum gains in quality, and it has helped various organizations achieve these objectives (a short numerical sketch of the sigma-level arithmetic follows the TPM section below).

Total Productive Maintenance (TPM)
Total Productive Maintenance (TPM) is a maintenance program which involves a newly defined concept for maintaining plants and equipment. The goal of a TPM program is to markedly increase production while, at the same time, increasing employee morale and job satisfaction. TPM aims to avoid wastage in a quickly changing economic environment, to produce goods without reducing product quality, to reduce cost, and so on.
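To make the sigma-level arithmetic mentioned under Six Sigma above concrete, here is a minimal sketch; it assumes the conventional one-sided calculation with the customary 1.5-sigma long-term mean shift, which is an industry convention rather than a statistical necessity.

from statistics import NormalDist

def dpmo(sigma_level, shift=1.5):
    # Defects per million opportunities for a given sigma quality level,
    # using the one-sided tail of the standard normal distribution after
    # applying the conventional 1.5-sigma long-term mean shift.
    return NormalDist().cdf(shift - sigma_level) * 1_000_000

for level in (3, 4, 5, 6):
    print(f"{level} sigma -> {dpmo(level):9,.1f} DPMO")

At six sigma this reproduces the oft-quoted figure of about 3.4 defects per million opportunities.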

Environment and ISO 14001
ISO 14001 is the internationally recognized standard for the environmental management of businesses. It prescribes controls for those activities that have an effect on the environment, including the use of natural resources, the handling and treatment of waste, and energy consumption. It offers a source of guidance for introducing and adopting environmental management systems based on the best universal practices, in the same way that the ISO 9000 series on quality management systems, now widely applied, represents a tool for transferring the best available quality management practices.

Pollution Control Techniques
Industrial pollution has become a serious problem in many developing countries. Its costs include serious damage to human health and ecosystems, and direct economic costs for households and businesses. The adoption of cost-effective cleaner technologies should be encouraged, along with the implementation of waste minimization techniques and appropriate pollution control measures.

OHSAS 18001 / Safety & PPEs
Safety is the state of being "safe": the condition of being protected against physical, social, financial, occupational, psychological or other types or consequences of failure, damage, error, accidents, harm or any other undesirable event. OHSAS 18001 is an international occupational health and safety (OHS) management system standard, designed to enable organizations to demonstrate their commitment to providing a safe and efficient working environment by identifying, understanding and eliminating or minimising risks to employees and other interested parties who may be exposed to OH&S risks associated with their activities.

Time Management
Time management is important for everyone. It is the art of arranging, organizing, scheduling and budgeting one's time for the purpose of generating more effective work and productivity. It refers to a range of skills, tools and techniques used to manage time when accomplishing specific tasks, projects and goals, encompassing a wide scope of activities including planning, allocating, setting goals, delegation, analysis of time spent, monitoring, organizing, scheduling and prioritizing. A time management system is a designed combination of processes, tools and techniques.

Leadership - Through Public Speaking and Communication
Leadership has been described as the process of social influence in which one person can enlist the aid and support of others in the accomplishment of a common task.

Leadership remains one of the most relevant aspects of the organizational context, and leadership development requires excellence in both public speaking and communication. Public speaking is the process of speaking to a group of people in a structured, deliberate manner intended to inform, influence or entertain the listeners. Communication skills are at the heart of interpersonal skills, and the greater your awareness of how they work, the more effective your communication will be.

Business Etiquette
Business etiquette is defined by good manners and by loyalty and commitment towards one's organization; following the right business etiquette is essential for positive career growth. Both social and business etiquette overlap considerably with the basic tenets of netiquette, the conventions for using computer networks. These rules are often echoed throughout an industry or economy. For instance, 49% of employers surveyed in 2008 by the American National Association of Colleges and Employers found that non-traditional attire would be a "strong influence" on their opinion of a potential job candidate.

Statistical Process Control (SPC)
Statistical process control (SPC) is the application of statistical methods to the monitoring and control of a process to ensure that it operates at its full potential to produce conforming product. The 7 QC Tools are a fixed set of graphical techniques identified as being most helpful in troubleshooting quality-related issues; they are used to solve the vast majority of such issues. The tools are the cause-and-effect or Ishikawa diagram, check sheet, control chart, histogram, Pareto chart, scatter diagram, and stratification (alternately, flow chart or run chart). A short control-chart sketch follows the APQP section below.

Failure Modes and Effects Analysis (FMEA)
A failure modes and effects analysis (FMEA) is a procedure in operations management for the analysis of potential failure modes within a system, classifying them by severity or determining the effect of failures on the system. It is widely used in manufacturing industries in various phases of the product life cycle and is now increasingly finding use in the service industry. Failure modes are any errors or defects in a process, design or item, especially those that affect the customer, and can be potential or actual. Effects analysis refers to studying the consequences of those failures.

APQP, Control Plan & PPAP
Advanced Product Quality Planning (APQP) is a framework of procedures and techniques used to develop products in industry, particularly the automotive industry. It is quite similar to the concept of Design for Six Sigma (DFSS). The Production Part Approval Process (PPAP) is used in the automotive supply chain to establish confidence in component suppliers and their production processes, by demonstrating that "all customer engineering design record and specification requirements are properly understood by the supplier and that the process has the potential to produce product consistently meeting these requirements during an actual production run at the quoted production rate."
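As an illustration of the control-chart idea referenced under Statistical Process Control above, here is a minimal sketch. It uses a simplified plus/minus three standard deviation limit on individual measurements; production SPC would normally estimate sigma from subgroup ranges using the standard control-chart constants, and the data here are hypothetical.

from statistics import mean, stdev

def control_limits(samples):
    # Center line at the sample mean; upper/lower control limits at
    # +/- 3 sample standard deviations (a simplified individuals chart).
    m = mean(samples)
    s = stdev(samples)
    return m - 3 * s, m, m + 3 * s

# Hypothetical measurements of a machined dimension (mm).
data = [10.02, 9.98, 10.05, 9.97, 10.01, 10.03, 9.99, 10.00, 10.04, 9.96]
lcl, cl, ucl = control_limits(data)
print(f"LCL={lcl:.3f}  CL={cl:.3f}  UCL={ucl:.3f}")

# Flag any points falling outside the control limits.
out_of_control = [x for x in data if not lcl <= x <= ucl]
print("points outside limits:", out_of_control or "none")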

MS Excel & MS Access
Microsoft Excel (full name Microsoft Office Excel) is a spreadsheet application written and distributed for Microsoft Windows and Mac OS X. It features calculation, graphing tools, pivot tables and a macro programming language called VBA (Visual Basic for Applications), and it has been the most widely used spreadsheet application available for these platforms. Excel forms part of Microsoft Office. Microsoft Office Access is a relational database management system from Microsoft that combines the relational Microsoft Jet Database Engine with a graphical user interface and software development tools. It is a member of the Microsoft Office suite of applications, included in the Professional and higher editions or sold separately.

Project Management
Project management is the discipline of planning, organizing, and managing resources to bring about the successful completion of specific project goals and objectives. The primary challenge of project management is to achieve all of the project goals and objectives while honoring the preconceived project constraints; typical constraints are scope, time, and budget. The secondary, and more ambitious, challenge is to optimize the allocation and integration of inputs necessary to meet pre-defined objectives.

