Wednesday, February 27, 2008

article : Understanding Intelligent Design Theory

by Babu Ranganathan
theconservativevoice.com
January 19, 2008 01:00 PM EST

Imagine finding a planet where robots are programmed so that they can
make other robots just like themselves from raw materials.
Now, imagine an alien visitor coming to the planet and, after many
years of studying these robots, coming to the conclusion that since
science can explain how these robots work, function, and reproduce
there's no reason to believe that there was an ultimate intelligent
designer behind them.

The analogy above certainly is not perfect but it is sufficient to
reveal the fallacious thinking of those who attack intelligent design
behind life and the universe.

Chance physical processes can produce some level of order but it is
not rational to believe that the highest levels of order in life and
the universe are by chance. For example, amino acids have been shown
to be able to come into existence by chance but not more complex
molecules or structures such as proteins which require that the
various amino acids be in a precise sequence, just like the letters in
a sentence. If they're not in the right sequence the protein molecules
will not function. A single cell alone has millions of protein
molecules!

There is no innate chemical tendency for the various amino acids to
bond with one another in a sequence. Any one amino acid can just as
easily bond with any other. The only reason the various amino acids
bond with one another in a precise sequence in the cells of our bodies
is that they're directed to do so by an already existing sequence of
molecules in our genetic code. Without being in a proper sequence,
protein molecules will not function.

The sequence of molecules in DNA (the genetic code) determines the
sequence of molecules in proteins. Furthermore, without DNA there
cannot be RNA, and without RNA there cannot be DNA. And without either
DNA or RNA there cannot be proteins, and without proteins there cannot
be DNA or RNA. They're all mutually dependent upon each other for
existence!

If the cell had evolved, it would have had to do so all at once. A
partially evolved cell could not wait millions of years to become
complete, because it would be highly unstable and would quickly
disintegrate in the open environment, especially without the
protection of a complete and fully functioning cell membrane.

Of course, once there is a complete and living cell, the genetic
program and highly complex biological mechanisms exist to direct the
formation of more cells. The cell's genetic code and biological
machinery will use and direct the raw materials entering the cell
from the environment into forming more cells. The question for
evolutionists and naturalists is how the cell, or life itself, could
have come about when there was no directing mechanism in Nature.

If humans must use intelligence to perform genetic engineering, to
meaningfully manipulate the genetic code, then what does that say
about the origin of the genetic code itself?

Contrary to popular belief, scientists have never created life in the
laboratory. What scientists have done is genetically alter or engineer
already existing forms of life, and by doing this scientists have been
able to produce new forms of life. However, they did not produce these
new life forms from non-living matter. Even if scientists ever do
produce life from non-living matter it won't be by chance so it still
wouldn't help support any argument for evolution.

Even in the recent case, as reported in the news, involving the
creation of what is called synthetic (or artificial) life, scientists
don't actually create or produce life itself from non-living matter.
What scientists do in this case is create (by intelligent design)
artificial DNA (genetic instructions and code) which is then implanted
into an already existing living cell, thereby changing that cell
into a new form of life. And, again, even if scientists ever do create
a whole living cell from scratch (and not just its DNA) it still would
not be by chance but by intelligent design. Synthetic life is another
form of genetic engineering. But God was there first. Remember that!

The great British scientist Sir Fred Hoyle has said that the
probability of the sequence of molecules in the simplest cell coming
into existence by chance is equivalent to a tornado going through a
junk yard of airplane parts and assembling a 747 Jumbo Jet!

Considering the enormous complexity of life, it is much more logical
to believe that the genetic and biological similarities between all
species are due to a common Designer rather than common evolutionary
ancestry. It is only logical that the great Designer would design
similar functions for similar purposes and different functions for
different purposes in all of the various forms of life.

What if we should find evidence of life on Mars? Wouldn't that prove
evolution? No. It wouldn't be proof that such life had evolved from
non-living matter by chance natural processes. And even if we did find
evidence of life on Mars, it would most likely have come from our
very own planet - Earth! In the Earth's past there was powerful
volcanic activity which could have easily spewed dirt containing
microbes into outer space which eventually could have reached Mars. A
Newsweek article of September 21, 1998, p.12 mentions exactly this
possibility.

We know from the law of entropy in science that the universe could
not have sustained itself from all eternity. It requires a beginning.
But we also know from science that natural laws
could not have brought the universe into being from nothing. The
beginning of the universe, therefore, points to a supernatural
origin!

Even the scientific followers of Ilya Prigogine, famous for his work
on self-organization and order arising out of chaos, have admitted
that only a very minimal level of order will ever be possible as a
result of spontaneous or chance processes.

Those advocating the teaching of intelligent design are not demanding
that Darwinian theory no longer be taught. Rather, the advocates of
intelligent design want the merits of both theories taught side by
side when the issue of origins is covered in science classes and
textbooks. This is only fair.

Science cannot prove how life originated since no human observed the
origin of life by either chance or design. Observation and detection
by the human senses, either directly or indirectly through scientific
instruments, is the basis of science and for establishing proof. The
issue is which position has better scientific support. Both sides
should have the opportunity to present their case.

What we believe about life's origins does influence our philosophy and
value of life as well as our view of ourselves and others. This is no
small issue!

Just because the laws of science can explain how life and the universe
operate and work doesn't mean there is no Maker. Would it be rational
to believe that there's no designer behind airplanes because the laws
of science can explain how airplanes operate and work?

Natural laws are adequate to explain how the order in life, the
universe, and even a microwave oven operates, but mere undirected
natural laws cannot fully explain the origin of such order.

If some astronauts from Earth discovered figures of persons similar to
Mt. Rushmore on an uninhabited planet there would be no way to
scientifically prove the carved figures originated by design or by
chance processes of erosion. Neither position is science, but
scientific arguments may be made to support one or the other.

All of this simply means that real science supports faith in God.
Science cannot prove that we are here by chance (evolution) or by
design (creation). However, the scientific evidence can be used to
support one or the other.

It is only fair that evidence supporting intelligent design be
presented to students alongside of evolutionary theory, especially in
public schools which receive funding from taxpayers who are on both
sides of the issue. Also, no one is being forced to believe in God or
adopt a particular religion so there is no true violation of
separation of church and state.

The best little article ever written refuting the origin of life by
chance is "A Few Reasons an Evolutionary Origin of Life Is Impossible"
by scientist and biochemist Dr. Duane T. Gish. Dr. Gish presents
"simple" but profound scientific barriers to evolution of life which
aren't mentioned or covered in Johnny's high school biology textbook
or in college textbooks for that matter. This article is truly great!
Dr. Gish's article may be accessed for reading at: http://icr.org/article/3140/
where links to other articles may also be found.

Trust me, Dawkins and all the evolutionists put together can't hold a
candle to the scientific genius of Dr. Gish. Just read one of Dr.
Gish's books and you'll see why. Dr. Gish has successfully debated
hundreds of evolution scientists in secular colleges and universities
across the nation over the past two decades, and students have
consistently voted him the winner in all of those debates. Don't try
looking for this news in the mainstream media. You won't find it
there any more than you'll find a half-evolved chipmunk running around
in your backyard!

There is, of course, much more to be said on this subject. Scientist,
creationist, debater, writer, and lecturer, Dr. Walt Brown covers
various scientific issues (e.g., fossils, biological variation and
diversity, the origin of life, comparative anatomy and embryology, the
issue of vestigial organs, the age of the earth, etc.) at greater
depth on his website at http://www.creationscience.com. Another
excellent source of information from highly qualified scientists who
are creationists is the Institute for Creation Research
(http://www.icr.org) in San Diego, California.


Tuesday, February 26, 2008

article : You Must Know : How Video Games Activate Reward Regions of the Brain in Men More Than Women, Stanford Study Finds

Allan Reiss, MD, and his colleagues have a pretty good idea why your husband or boyfriend can't put down Halo 3. In a first-of-its-kind imaging study, the Stanford University School of Medicine researchers have shown that the part of the brain that generates rewarding feelings is more activated in men than women during video-game play.

"These gender differences may help explain why males are more attracted to, and more likely to become 'hooked' on video games than females," the researchers wrote in their paper, which was recently published online in the Journal of Psychiatric Research.

More than 230 million video and computer games were sold in 2005, and polls show that 40 percent of Americans play games on a computer or a console. According to a 2007 Harris Interactive survey, young males are two to three times more likely than females to feel addicted to video games, such as the Halo series so popular in recent years.

Despite the popularity of video and computer games, little is known about the neural processes that occur as people play these games. And no research had been done on gender-specific differences in the brain's response to video games.

Reiss, senior author of the study and the Howard C. Robbins Professor of Psychiatry and Behavioral Sciences, has long been interested in studying gender differences; in 2005, he published a study showing that men and women process humor differently. He and his colleagues became interested in exploring the concept of territoriality, and they determined the best way to do so was with a simple computer game.

The researchers designed a game involving a vertical line (the "wall") in the middle of a computer screen. When the game begins, 10 balls appear to the right of the wall and travel left toward the wall. Each time a ball is clicked, it disappears from the screen. If the balls are kept a certain distance from the wall, the wall moves to the right and the player gains territory, or space, on the screen. If a ball hits the wall before it's clicked, the line moves to the left and the player loses territory on the screen.
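
For readers who like to see the mechanic spelled out, here is a minimal Python sketch of the territory rule as described above; the threshold and step sizes are made-up values, not parameters from the actual study.

```python
# Minimal sketch of the "ball and wall" territory rule described above.
# All numeric values are illustrative assumptions, not the study's parameters.

THRESHOLD = 0.2   # assumed: a ball clicked while at least this far from the wall gains territory
GAIN = 0.05       # assumed: how far the wall moves right after a successful click
LOSS = 0.05       # assumed: how far the wall moves left when a ball reaches it unclicked

def update_wall(wall_x: float, ball_x: float, clicked: bool) -> float:
    """Return the new wall position after one ball is resolved.

    Positions are fractions of screen width; the wall starts at 0.5 and
    moving it right (larger x) means the player has gained territory.
    """
    if clicked and (ball_x - wall_x) >= THRESHOLD:
        return min(1.0, wall_x + GAIN)   # clicked early enough: gain space
    if not clicked and ball_x <= wall_x:
        return max(0.0, wall_x - LOSS)   # ball reached the wall: lose space
    return wall_x                        # late click: ball disappears, no change

wall = 0.5
wall = update_wall(wall, ball_x=0.9, clicked=True)   # early click, wall moves right
wall = update_wall(wall, ball_x=0.6, clicked=True)   # late click, no change
print(f"net territory gained: {wall - 0.5:+.2f} of screen width")  # +0.05
```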

During this study, 22 young adults (11 men and 11 women) played numerous 24-second intervals of the game while being hooked up to a functional magnetic resonance imaging, or fMRI, machine. fMRI is designed to produce a dynamic image showing which parts of the brain are working during a given activity.

Study participants were instructed to click as many balls as possible; they weren't told that they could gain or lose territory depending on what they did with the balls. Reiss said all participants quickly learned the point of the game, and the male and female participants wound up clicking on the same number of balls. The men, however, wound up gaining a significantly greater amount of space than the women. That's because the men identified which balls - the ones closest to the "wall" - would help them acquire the most space if clicked.

"The females 'got' the game, and they moved the wall in the direction you would expect," said Reiss, who is director of the Center for Interdisciplinary Brain Sciences Research. "They appeared motivated to succeed at the game. The males were just a lot more motivated to succeed."

After analyzing the imaging data for the entire group, the researchers found that the participants showed activation in the brain's mesocorticolimbic center, the region typically associated with reward and addiction. Male brains, however, showed much greater activation, and the amount of activation was correlated with how much territory they gained. (This wasn't the case with women.) Three structures within the reward circuit - the nucleus accumbens, amygdala and orbitofrontal cortex - were also shown to influence each other much more in men than in women. And the better connected this circuit was, the better males performed in the game.

The findings indicate, the researchers said, that successfully acquiring territory in a computer game format is more rewarding for men than for women. And Reiss, for one, isn't surprised. "I think it's fair to say that males tend to be more intrinsically territorial," he said. "It doesn't take a genius to figure out who historically are the conquerors and tyrants of our species-they're the males."

Reiss said this research also suggests that males have neural circuitry that makes them more liable than women to feel rewarded by a computer game with a territorial component and then more motivated to continue game-playing behavior. Based on this, he said, it makes sense that males are more prone to getting hooked on video games than females.
"Most of the computer games that are really popular with males are territory- and aggression-type games," he pointed out.

Reiss said the team's findings may apply to other types of video and computer games. "This is a fairly representative, generic computer game," he said, adding that he and his colleagues are planning further work in this area.
Source : Stanford University Medical Center


Friday, February 22, 2008

article : Molecular Biology & Biological Science

Molecular biology is the study of biology at a molecular level. The field overlaps with other areas of biology and chemistry, particularly genetics and biochemistry. Molecular biology chiefly concerns itself with understanding the interactions between the various systems of a cell, including the interactions between DNA, RNA and protein biosynthesis and learning how these interactions are regulated.

Writing in Nature, William Astbury described molecular biology as:

"... not so much a technique as an approach, an approach from the viewpoint of the so-called basic sciences with the leading idea of searching below the large-scale manifestations of classical biology for the corresponding molecular plan. It is concerned particularly with the forms of biological molecules and ..... is predominantly three-dimensional and structural - which does not mean, however, that it is merely a refinement of morphology - it must at the same time inquire into genesis and function." [1]

Relationship to other "molecular-scale" biological sciences

Schematic relationship between biochemistry, genetics and molecular biology

Researchers in molecular biology use specific techniques native to molecular biology (see Techniques section later in article), but increasingly combine these with techniques and ideas from genetics and biochemistry. There is not a defined line between these disciplines. Today the terms molecular biology and biochemistry are nearly interchangeable. The following figure is a schematic that depicts one possible view of the relationship between the fields:

  • Biochemistry is the study of the chemical substances and vital processes occurring in living organisms. Biochemists focus heavily on the role, function, and structure of biomolecules. The study of the chemistry behind biological processes and the synthesis of biologically active molecules are examples of biochemistry.
  • Genetics is the study of the effect of genetic differences on organisms. Often this can be inferred from the absence of a normal component (e.g. one gene), through the study of "mutants" – organisms which lack one or more functional components with respect to the so-called "wild type" or normal phenotype. Genetic interactions such as epistasis can often confound simple interpretations of such "knock-out" studies.
  • Molecular biology is the study of the molecular underpinnings of the processes of replication, transcription and translation of the genetic material. The central dogma of molecular biology, in which genetic material is transcribed into RNA and then translated into protein, despite being an oversimplified picture, still provides a good starting point for understanding the field. This picture, however, is undergoing revision in light of emerging novel roles for RNA.
Much of the work in molecular biology is quantitative, and recently much work has been done at the interface of molecular biology and computer science, in bioinformatics and computational biology. As of the early 2000s, the study of gene structure and function, molecular genetics, has been among the most prominent sub-fields of molecular biology.
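
To make this computational side concrete, here is a minimal Python sketch of the central dogma just described: a DNA coding sequence is transcribed into mRNA and then translated into protein. The sequence and the deliberately truncated codon table are made-up examples.

```python
# Toy walk through the central dogma: DNA -> RNA -> protein.
# The codon table is truncated to the codons used in the example
# sequence; a real table covers all 64 codons.

CODON_TABLE = {
    "AUG": "M",  # methionine, start codon
    "GCU": "A",  # alanine
    "AAA": "K",  # lysine
    "UGA": "*",  # stop codon
}

def transcribe(dna: str) -> str:
    """Transcribe the coding (sense) strand of DNA into mRNA (T -> U)."""
    return dna.upper().replace("T", "U")

def translate(mrna: str) -> str:
    """Translate mRNA into a one-letter amino-acid string, stopping at a stop codon."""
    protein = []
    for i in range(0, len(mrna) - 2, 3):
        aa = CODON_TABLE.get(mrna[i:i + 3], "?")  # "?" for codons outside this toy table
        if aa == "*":
            break
        protein.append(aa)
    return "".join(protein)

# Example (hypothetical sequence): the precise order of bases fixes the
# precise order of amino acids, which is the point of the central dogma.
dna = "ATGGCTAAATGA"
mrna = transcribe(dna)   # "AUGGCUAAAUGA"
print(translate(mrna))   # "MAK"
```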

Increasingly many other fields of biology focus on molecules, either directly studying their interactions in their own right such as in cell biology and developmental biology, or indirectly, where the techniques of molecular biology are used to infer historical attributes of populations or species, as in fields in evolutionary biology such as population genetics and phylogenetics. There is also a long tradition of studying biomolecules "from the ground up" in biophysics.

Techniques of molecular biology
Since the late 1950s and early 1960s, molecular biologists have learned to characterize, isolate, and manipulate the molecular components of cells and organisms. These components include DNA, the repository of genetic information; RNA, a close relative of DNA whose functions range from serving as a temporary working copy of DNA to actual structural and enzymatic functions as well as a functional and structural part of the translational apparatus; and proteins, the major structural and enzymatic type of molecule in cells.

Expression Cloning
Main article: Expression cloning

One of the most basic techniques of molecular biology to study protein function is expression cloning. In this technique, DNA coding for a protein of interest is cloned (using PCR and/or restriction enzymes) into a plasmid (known as an expression vector). This plasmid may have special promoter elements to drive production of the protein of interest, and may also have antibiotic resistance markers to help follow the plasmid.

This plasmid can be inserted into either bacterial or animal cells. Introducing DNA into bacterial cells is called transformation, and can be accomplished by several methods, including electroporation, microinjection, passive uptake and conjugation. Introducing DNA into eukaryotic cells, such as animal cells, is called transfection. Several different transfection techniques are available, including calcium phosphate transfection, liposome transfection, and proprietary transfection reagents such as Fugene. DNA can also be introduced into cells using viruses or pathogenic bacteria as carriers. In such cases, the technique is called viral/bacterial transduction, and the cells are said to be transduced.

In either case, DNA coding for a protein of interest is now inside a cell, and the protein can now be expressed. A variety of systems, such as inducible promoters and specific cell-signaling factors, are available to help express the protein of interest at high levels. Large quantities of a protein can then be extracted from the bacterial or eukaryotic cell. The protein can be tested for enzymatic activity under a variety of situations, the protein may be crystallized so its tertiary structure can be studied, or, in the pharmaceutical industry, the activity of new drugs against the protein can be studied.


Polymerase chain reaction (PCR)
Main article: Polymerase chain reaction

The polymerase chain reaction is an extremely versatile technique for copying DNA. In brief, PCR allows a single DNA sequence to be copied millions of times, or altered in predetermined ways. For example, PCR can be used to introduce restriction enzyme sites, or to mutate (change) particular bases of DNA. PCR can also be used to determine whether a particular DNA fragment is found in a cDNA library. PCR has many variations, such as reverse transcription PCR (RT-PCR) for amplification of RNA, and, more recently, real-time PCR (qPCR), which allows quantitative measurement of DNA or RNA molecules.
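
As a purely in-silico illustration of what a primer pair "selects" from a template, here is a small Python sketch that reports the region bounded by a forward primer and the reverse complement of a reverse primer. The sequences are invented, and real primer design also has to account for melting temperature, specificity and product length.

```python
# In-silico sketch of the stretch of template a primer pair would copy.
# Sequences are invented; this ignores melting temperature, mismatches,
# and everything else real PCR depends on.

COMPLEMENT = str.maketrans("ACGT", "TGCA")

def reverse_complement(seq: str) -> str:
    return seq.translate(COMPLEMENT)[::-1]

def predicted_amplicon(template: str, fwd_primer: str, rev_primer: str) -> str:
    """Return the template region bounded by the primers, or '' if either site is absent."""
    start = template.find(fwd_primer)
    if start == -1:
        return ""
    # The reverse primer anneals to the opposite strand, so look for its
    # reverse complement on this strand, downstream of the forward site.
    rev_site = template.find(reverse_complement(rev_primer), start + len(fwd_primer))
    if rev_site == -1:
        return ""
    return template[start:rev_site + len(rev_primer)]

template = "TTACGGATCCGTACGTACGTACGTGGTACCAATG"
print(predicted_amplicon(template, fwd_primer="GGATCC", rev_primer="GGTACC"))
# -> GGATCCGTACGTACGTACGTGGTACC
```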

Gel electrophoresis
Main article: Gel electrophoresis
Gel electrophoresis is one of the principal tools of molecular biology. The basic principle is that DNA, RNA, and proteins can all be separated by means of an electric field. In agarose gel electrophoresis, DNA and RNA can be separated on the basis of size by running the DNA through an agarose gel. Proteins can be separated on the basis of size by using an SDS-PAGE gel, or on the basis of size and their electric charge by using what is known as a 2D gel.

Southern blotting
Main article: Southern blot

Named after its inventor, biologist Edwin Southern, the Southern blot is a method for probing for the presence of a specific DNA sequence within a DNA sample. DNA samples, before or after restriction enzyme digestion, are separated by gel electrophoresis and then transferred to a membrane by blotting via capillary action. The membrane can then be probed with a labeled DNA probe complementary to the sequence of interest. Most original protocols used radioactive labels; however, non-radioactive alternatives are now available. Southern blotting is less commonly used in laboratory science because PCR can detect specific DNA sequences from DNA samples directly. These blots are still used for some applications, however, such as measuring transgene copy number in transgenic mice, or in the engineering of gene knockout embryonic stem cell lines.

Northern blotting
Main article: Northern blot

The Northern blot is used to study the expression pattern of a specific type of RNA molecule as a relative comparison among a set of different RNA samples. It is essentially a combination of denaturing RNA gel electrophoresis and a blot. In this process RNA is separated based on size and is then transferred to a membrane that is then probed with a labeled complement of a sequence of interest. The results may be visualized in a variety of ways depending on the label used; however, most result in the revelation of bands representing the sizes of the RNA detected in the sample. The intensity of these bands is related to the amount of the target RNA in the samples analyzed. The procedure is commonly used to study when and how much gene expression is occurring by measuring how much of that RNA is present in different samples. It is one of the most basic tools for determining at what time, and under what conditions, certain genes are expressed in living tissues.

Western blotting
Main article: western blot

Antibodies to most proteins can be created by injecting small amounts of the protein into an animal such as a mouse, rabbit, sheep, or donkey (polyclonal antibodies) or produced in cell culture (monoclonal antibodies). These antibodies can be used for a variety of analytical and preparative techniques.

In western blotting, proteins are first separated by size, in a thin gel sandwiched between two glass plates in a technique known as SDS-PAGE (sodium dodecyl sulphate polyacrylamide gel electrophoresis). The proteins in the gel are then transferred to a PVDF, nitrocellulose, nylon or other support membrane. This membrane can then be probed with solutions of antibodies. Antibodies that specifically bind to the protein of interest can then be visualized by a variety of techniques, including coloured products, chemiluminescence, or autoradiography.

Analogous methods to western blotting can also be used to directly stain specific proteins in cells and tissue sections. However, these immunostaining methods are typically more associated with cell biology than molecular biology.
The terms "western" and "northern" are jokes: The first blots were with DNA, and since they were done by Ed Southern, they came to be known as Southerns. Patricia Thomas, inventor of the RNA blot, which became known as a "northern", actually didn't use the term. [2]. To carry the joke further, one can find reference in the literature [1] to "southwesterns" (Protein-DNA interactions) and "farwesterns" (Protein-Protein interactions).

Arrays
Main article: DNA microarray

A DNA array is a collection of spots attached to a solid support such as a microscope slide; each spot contains one or more DNA oligonucleotides. Arrays make it possible to put down a large number of very small (100 micrometre diameter) spots on a single slide; if each spot has a DNA molecule that is complementary to a single gene (similar to Southern blotting), one can analyze the expression of every gene in an organism in a single expression profiling experiment. For instance, the common baker's yeast, Saccharomyces cerevisiae, contains about 7000 genes; with a microarray, one can measure quantitatively how each gene is expressed, and how that expression changes, for example, with a change in temperature. There are many different ways to fabricate microarrays; the most common are silicon chips, microscope slides with spots of ~100 micrometre diameter, custom arrays, and arrays with larger spots on porous membranes (macroarrays).
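
To make the "measure how expression changes" idea concrete, here is a small Python sketch that computes log2 fold-changes between two conditions from spot intensities; the gene names and numbers are invented purely for illustration.

```python
import math

# Made-up spot intensities from two hybridizations of the same array design
# (e.g., yeast grown at 25 C vs. 37 C); gene names and values are invented.
intensity_25c = {"HSP104": 180.0, "ACT1": 950.0, "CUP1": 60.0}
intensity_37c = {"HSP104": 2900.0, "ACT1": 1020.0, "CUP1": 55.0}

def log2_fold_change(a: float, b: float) -> float:
    """log2(b / a): positive means higher expression in the second condition."""
    return math.log2(b / a)

for gene in intensity_25c:
    ratio = log2_fold_change(intensity_25c[gene], intensity_37c[gene])
    print(f"{gene}: log2 fold-change = {ratio:+.2f}")
# In this toy data HSP104 comes out strongly induced by the temperature
# shift, while ACT1 and CUP1 are nearly unchanged.
```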

Arrays can also be made with molecules other than DNA. For example, an antibody array can be used to determine what proteins or bacteria are present in a blood sample.

Abandoned technology
As new procedures and technology become available, the older technology is rapidly abandoned. A good example is methods for determining the size of DNA molecules. Prior to gel electrophoresis (agarose or polyacrylamide) DNA was sized with rate sedimentation in sucrose gradients, a slow and labor intensive technology requiring expensive instrumentation; prior to sucrose gradients, viscometry was used.

Aside from its historical interest, older technology is still worth knowing about, as it may be useful for solving a particular problem.

History
Main article: History of molecular biology

Molecular biology was established in the 1930s, although the term itself was first coined by Warren Weaver in 1938. Weaver was director of Natural Sciences for the Rockefeller Foundation at the time and believed that biology was about to undergo a period of significant change given recent advances in fields such as X-ray crystallography. He therefore channeled significant amounts of (Rockefeller Institute) money into biological fields.



Monday, February 18, 2008

article : What is Biotechnology ?

Biotechnology. Gene therapy using an Adenovirus vector. A new gene is inserted into an adenovirus vector, which is used to introduce the modified DNA into a human cell. If the treatment is successful, the new gene will make a functional protein.

Gene therapy may be used for treating, or even curing, genetic and acquired diseases like cancer and AIDS by using normal genes to supplement or replace defective genes or to bolster a normal function such as immunity. It can be used to target somatic (i.e., body) or germ (i.e., egg and sperm) cells. In somatic gene therapy, the genome of the recipient is changed, but this change is not passed along to the next generation. In contrast, in germline gene therapy, the egg and sperm cells of the parents are changed for the purpose of passing on the changes to their offspring.

There are basically two ways of implementing a gene therapy treatment:
1. Ex vivo, which means “outside the body” – Cells from the patient’s blood or bone marrow are removed and grown in the laboratory. They are then exposed to a virus carrying the desired gene. The virus enters the cells, and the desired gene becomes part of the DNA of the cells. The cells are allowed to grow in the laboratory before being returned to the patient by injection into a vein.

2. In vivo, which means “inside the body” – No cells are removed from the patient’s body. Instead, vectors are used to deliver the desired gene to cells in the patient’s body.
Currently, the use of gene therapy is limited. Somatic gene therapy is primarily at the experimental stage. Germline therapy is the subject of much discussion but it is not being actively investigated in larger animals and human beings.

As of June 2001, more than 500 clinical gene-therapy trials involving about 3,500 patients have been identified worldwide. Around 78% of these are in the United States, with Europe having 18%. These trials focus on various types of cancer, although other multigenic diseases are being studied as well. Recently, two children born with severe combined immunodeficiency disorder (“SCID”) were reported to have been cured after being given genetically engineered cells.
Gene therapy faces many obstacles before it can become a practical approach for treating disease.[10] At least four of these obstacles are as follows:

1. Gene delivery tools. Genes are inserted into the body using gene carriers called vectors. The most common vectors now are viruses, which have evolved a way of encapsulating and delivering their genes to human cells in a pathogenic manner. Scientists manipulate the genome of the virus by removing the disease-causing genes and inserting the therapeutic genes. However, while viruses are effective, they can introduce problems like toxicity, immune and inflammatory responses, and gene control and targeting issues.

2. Limited knowledge of the functions of genes. Scientists currently know the functions of only a few genes. Hence, gene therapy can address only some genes that cause a particular disease. Worse, it is not known exactly whether genes have more than one function, which creates uncertainty as to whether replacing such genes is indeed desirable.

3. Multigene disorders and effect of environment. Most genetic disorders involve more than one gene. Moreover, most diseases involve the interaction of several genes and the environment. For example, many people with cancer not only inherit the disease gene for the disorder, but may have also failed to inherit specific tumor suppressor genes. Diet, exercise, smoking and other environmental factors may have also contributed to their disease.

4. High costs. Since gene therapy is relatively new and at an experimental stage, it is an expensive treatment to undertake. This explains why current studies are focused on illnesses commonly found in developed countries, where more people can afford to pay for treatment. It may take decades before developing countries can take advantage of this technology.

Human Genome Project

DNA Replication image from the Human Genome Project (HGP)
The Human Genome Project is an initiative of the U.S. Department of Energy (“DOE”) that aims to generate a high-quality reference sequence for the entire human genome and identify all the human genes.

The DOE and its predecessor agencies were assigned by the U.S. Congress to develop new energy resources and technologies and to pursue a deeper understanding of potential health and environmental risks posed by their production and use. In 1986, the DOE announced its Human Genome Initiative. Shortly thereafter, the DOE and National Institutes of Health developed a plan for a joint Human Genome Project (“HGP”), which officially began in 1990.

The HGP was originally planned to last 15 years. However, rapid technological advances and worldwide participation accelerated the completion date to 2005. Already it has enabled gene hunters to pinpoint genes associated with more than 30 disorders.[11]

Cloning
Cloning involves the removal of the nucleus from one cell and its placement in an unfertilized egg cell whose nucleus has either been deactivated or removed.
There are two types of cloning:

1. Reproductive cloning. After a few divisions, the egg cell is placed into a uterus where it is allowed to develop into a fetus that is genetically identical to the donor of the original nucleus.
2. Therapeutic cloning.[12] The egg is placed into a Petri dish where it develops into embryonic stem cells, which have shown potential for treating several ailments.[13]
In February 1997, cloning became the focus of media attention when Ian Wilmut and his colleagues at the Roslin Institute announced the successful cloning of a sheep, named Dolly, from the mammary glands of an adult female. The cloning of Dolly made it apparent to many that the techniques used to produce her could someday be used to clone human beings.[14] This stirred a lot of controversy because of its ethical implications.

Current Research
In January 2008, Christopher S. Chen made an exciting discovery that could potentially alter the future of medicine. He found that cell signaling that is normally biochemically regulated could be simulated with magnetic nanoparticles attached to a cell surface. The discovery of Donald Ingber, Robert Mannix, and Sanjay Kumar, who found that a nanobead can be attached to a monovalent ligand, and that these compounds can bind to Mast cells without triggering the clustering response, inspired Chen’s research. Usually, when a multivalent ligand attaches to the cell’s receptors, the signal pathway is activated. However, these nanobeads only initiated cell signaling when a magnetic field was applied to the area, thereby causing the nanobeads to cluster. It is important to note that this clustering triggered the cellular response, not merely the force applied to the cell due to the receptor binding. This experiment was carried out several times with time-varying activation cycles. However, there is no reason to suggest that the response time could not be reduced to seconds or even milliseconds. This low response time has exciting applications in the medical field. Currently it takes minutes or hours for a pharmaceutical to affect its environment, and when it does so, the changes are irreversible. With the current research in mind, though, a future of millisecond response times and reversible effects is possible. Imagine being able to treat various allergic responses, colds, and other such ailments almost instantaneously. This future has not yet arrived, however, and further research and testing must be done in this area, but this is an important step in the right direction.[15]

Agriculture

Improve yield from crops
Using the techniques of modern biotechnology, one or two genes may be transferred to a highly developed crop variety to impart a new character that would increase its yield (30). However, while increases in crop yield are the most obvious applications of modern biotechnology in agriculture, it is also the most difficult one. Current genetic engineering techniques work best for effects that are controlled by a single gene. Many of the genetic characteristics associated with yield (e.g., enhanced growth) are controlled by a large number of genes, each of which has a minimal effect on the overall yield (31). There is, therefore, much scientific work to be done in this area.

Reduced vulnerability of crops to environmental stresses
Crops containing genes that will enable them to withstand biotic and abiotic stresses may be developed. For example, drought and excessively salty soil are two important limiting factors in crop productivity. Biotechnologists are studying plants that can cope with these extreme conditions in the hope of finding the genes that enable them to do so and eventually transferring these genes to the more desirable crops. One of the latest developments is the identification of a plant gene, At-DBF2, from thale cress, a tiny weed that is often used for plant research because it is very easy to grow and its genetic code is well mapped out. When this gene was inserted into tomato and tobacco cells, the cells were able to withstand environmental stresses like salt, drought, cold and heat, far more than ordinary cells. If these preliminary results prove successful in larger trials, then At-DBF2 genes can help in engineering crops that can better withstand harsh environments (32). Researchers have also created transgenic rice plants that are resistant to rice yellow mottle virus (RYMV). In Africa, this virus destroys the majority of the rice crop and makes the surviving plants more susceptible to fungal infections (33).

Increased nutritional qualities of food crops
Proteins in foods may be modified to increase their nutritional qualities. Proteins in legumes and cereals may be transformed to provide the amino acids needed by human beings for a balanced diet (34). A good example is the work of Professors Ingo Potrykus and Peter Beyer on the so-called Golden Rice™ (discussed below).

Improved taste, texture or appearance of food
Modern biotechnology can be used to slow down the process of spoilage so that fruit can ripen longer on the plant and then be transported to the consumer with a still reasonable shelf life. This improves the taste, texture and appearance of the fruit. More importantly, it could expand the market for farmers in developing countries due to the reduction in spoilage.
The first genetically modified food product was a tomato which was transformed to delay its ripening (35). Researchers in Indonesia, Malaysia, Thailand, Philippines and Vietnam are currently working on delayed-ripening papaya in collaboration with the University of Nottingham and Zeneca (36).

Reduced dependence on fertilizers, pesticides and other agrochemicals
Most of the current commercial applications of modern biotechnology in agriculture are on reducing the dependence of farmers on agrochemicals. For example, Bacillus thuringiensis (Bt) is a soil bacterium that produces a protein with insecticidal qualities. Traditionally, a fermentation process has been used to produce an insecticidal spray from these bacteria. In this form, the Bt toxin occurs as an inactive protoxin, which requires digestion by an insect to be effective. There are several Bt toxins and each one is specific to certain target insects. Crop plants have now been engineered to contain and express the genes for Bt toxin, which they produce in its active form. When a susceptible insect ingests the transgenic crop cultivar expressing the Bt protein, it stops feeding and soon thereafter dies as a result of the Bt toxin binding to its gut wall. Bt corn is now commercially available in a number of countries to control corn borer (a lepidopteran insect), which is otherwise controlled by spraying (a more difficult process).

Crops have also been genetically engineered to acquire tolerance to broad-spectrum herbicide. The lack of cost-effective herbicides with broad-spectrum activity and no crop injury was a consistent limitation in crop weed management. Multiple applications of numerous herbicides were routinely used to control a wide range of weed species detrimental to agronomic crops. Weed management tended to rely on preemergence — that is, herbicide applications were sprayed in response to expected weed infestations rather than in response to actual weeds present. Mechanical cultivation and hand weeding were often necessary to control weeds not controlled by herbicide applications. The introduction of herbicide tolerant crops has the potential of reducing the number of herbicide active ingredients used for weed management, reducing the number of herbicide applications made during a season, and increasing yield due to improved weed management and less crop injury. Transgenic crops that express tolerance to glyphosate, glufosinate and bromoxynil have been developed. These herbicides can now be sprayed on transgenic crops without inflicting damage on the crops while killing nearby weeds (37).
From 1996 to 2001, herbicide tolerance was the most dominant trait introduced to commercially available transgenic crops, followed by insect resistance. In 2001, herbicide tolerance deployed in soybean, corn and cotton accounted for 77% of the 626,000 square kilometres planted to transgenic crops; Bt crops accounted for 15%; and "stacked genes" for herbicide tolerance and insect resistance used in both cotton and corn accounted for 8% (38).

Production of novel substances in crop plants
Biotechnology is being applied for novel uses other than food. For example, oilseed can be modified to produce fatty acids for detergents, substitute fuels and petrochemicals. Potato, tomato, rice, and other plants have been genetically engineered to produce insulin and certain vaccines. If future clinical trials prove successful, the advantages of edible vaccines would be enormous, especially for developing countries. The transgenic plants may be grown locally and cheaply. Homegrown vaccines would also avoid logistical and economic problems posed by having to transport traditional preparations over long distances and keeping them cold while in transit. And since they are edible, they will not need syringes, which are not only an additional expense in the traditional vaccine preparations but also a source of infections if contaminated.[16] In the case of insulin grown in transgenic plants, it might not be administered as an edible protein, but it could be produced at significantly lower cost than insulin produced in costly bioreactors.

Criticism
There is, many people say, another and darker side to the agricultural biotechnology issue. It includes increased herbicide usage and resultant herbicide resistance, "super weeds," residues on and in food crops, genetic contamination of non-GM crops that hurts organic and conventional farmers, damage to wildlife from glyphosate, etc.[2][3]

Biological engineering

Main article: Bioengineering
Biotechnological engineering or biological engineering is a branch of engineering that focuses on biotechnologies and biological science. It includes different disciplines such as biochemical engineering, biomedical engineering, bio-process engineering, biosystem engineering and so on. Because the field is so new, the definition of a bioengineer is not yet firmly settled. In general, however, bioengineering is an integrated approach that combines the fundamental biological sciences with traditional engineering principles.

Bioengineers are often employed to scale up bio processes from the laboratory scale to the manufacturing scale. Moreover, as with most engineers, they often deal with management, economic and legal issues. Since patents and regulation (e.g. FDA regulation in the U.S.) are very important issues for biotech enterprises, bioengineers are often required to have knowledge related to these issues.
The increasing number of biotech enterprises is likely to create a need for bioengineers in the years to come. Many universities throughout the world are now providing programs in bioengineering and biotechnology (as independent programs or specialty programs within more established engineering fields).

Bioremediation and Biodegradation

Main article: Microbial biodegradation

Biotechnology is being used to engineer and adapt organisms especially microorganisms in an effort to find sustainable ways to clean up contaminated environments. The elimination of a wide range of pollutants and wastes from the environment is an absolute requirement to promote a sustainable development of our society with low environmental impact. Biological processes play a major role in the removal of contaminants and biotechnology is taking advantage of the astonishing catabolic versatility of microorganisms to degrade/convert such compounds. New methodological breakthroughs in sequencing, genomics, proteomics, bioinformatics and imaging are producing vast amounts of information. In the field of Environmental Microbiology, genome-based global studies open a new era providing unprecedented in silico views of metabolic and regulatory networks, as well as clues to the evolution of degradation pathways and to the molecular adaptation strategies to changing environmental conditions. Functional genomic and metagenomic approaches are increasing our understanding of the relative importance of different pathways and regulatory networks to carbon flux in particular environments and for particular compounds and they will certainly accelerate the development of bioremediation technologies and biotransformation processes.[17]
Marine environments are especially vulnerable since oil spills of coastal regions and the open sea are poorly containable and mitigation is difficult. In addition to pollution through human activities, millions of tons of petroleum enter the marine environment every year from natural seepages. Despite its toxicity, a considerable fraction of petroleum oil entering marine systems is eliminated by the hydrocarbon-degrading activities of microbial communities, in particular by a remarkable recently discovered group of specialists, the so-called hydrocarbonoclastic bacteria (HCB).[18]

Notable researchers and individuals

Canada : Frederick Banting, Lap-Chee Tsui, Tak Wah Mak, Lorne Babiuk
Europe : Paul Nurse, Jacques Monod, Francis Crick
Finland : Leena Palotie
Iceland : Kari Stefansson
India : Kiran Mazumdar-Shaw (Biocon), Advait Nair, Arun Kumar
Ireland : Timothy O'Brien, Dermot P Kelleher
Mexico : Francisco BolĂ­var Zapata, Luis Herrera-Estrella
U.S. : David Botstein, Craig Venter, Sydney Brenner, Eric Lander, Leroy Hood, Robert Langer, James J. Collins, Henry I. Miller, Roger Beachy, Herbert Boyer, Michael West, Thomas Okarma, James D. Watson


Monday, February 4, 2008

article : The Types of Microscopes

A microscope is an optical instrument used to increase the apparent size of an object.

Simple Microscopes

A magnifying glass, an ordinary double convex lens having a short focal length, is a simple microscope. The reading lens and hand lens are instruments of this type. When an object is placed nearer such a lens than its principal focus, i.e., within its focal length, an image is produced that is erect and larger than the original object. The image is also virtual; i.e., it cannot be projected on a screen as can a real image.

Compound Microscopes

The compound microscope consists essentially of two or more double convex lenses fixed in the two extremities of a hollow cylinder. The lower lens (nearest to the object) is called the objective; the upper lens (nearest to the eye of the observer), the eyepiece. The cylinder is mounted upright on a screw device, which permits it to be raised or lowered until the object is in focus, i.e., until a clear image is formed. When an object is in focus, a real, inverted image is formed by the lower lens at a point inside the principal focus of the upper lens. This image serves as an "object" for the upper lens which produces another image larger still (but virtual) and visible to the eye of the observer.

Computation of Magnifying Power

The magnifying power of a lens is commonly expressed in diameters. For example, if a lens magnifies an object 5 times, the magnification is said to be 5 diameters, commonly written simply "5x." The total magnification of a compound microscope is computed by multiplying the magnifying power of the objective by the magnifying power of the eyepiece.
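
As a worked example of that product rule (the objective and eyepiece powers below are simply assumed example values):

```python
# Total magnification = objective power x eyepiece power.
# A 40x objective paired with a 5x eyepiece is an assumed example.
objective_power = 40   # "40x" objective
eyepiece_power = 5     # "5x" eyepiece
total_magnification = objective_power * eyepiece_power
print(f"Total magnification: {total_magnification}x")   # Total magnification: 200x
```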


Development and Uses

The invention of the microscope is variously accredited to Zacharias Janssen, a Dutch spectaclemaker, c.1590, and to Galileo, who announced his invention in 1610. Others are known for their discoveries made by the use of the instrument and for their new designs and improvements, among them G. B. Amici, Nehemiah Grew, Robert Hooke, Antony van Leeuwenhoek, Marcello Malpighi, and Jan Swammerdam. The compound microscope is widely used in bacteriology, biology, and medicine in the examination of such extremely minute objects as bacteria, other unicellular organisms, and plant and animal cells and tissue—fine optical microscopes are capable of resolving objects as small as 5000 Angstroms. It has been extremely important in the development of the biological sciences and of medicine.

Modified Compound Microscopes

The ultramicroscope is an apparatus consisting essentially of a compound microscope with an arrangement by which the material to be viewed is illuminated by a point of light placed at right angles to the plane of the objective and brought to a focus directly beneath it. This instrument is used especially in the study of Brownian movement in colloidal solutions (see colloid). The phase-contrast microscope, a modification of the compound microscope, makes transparent objects visible; it is used to study living cells. The television microscope uses ultraviolet light. Since this light is not visible, the apparatus is used with a special camera and may be connected with a television receiver on which the objects (e.g., living microorganisms) may be observed in color.

Electron Microscopes

The electron microscope, which is not limited by the powers of optical lenses and light, permits greater magnification and greater depth of focus than the optical microscope and reveals more details of structure. Instead of light rays it employs a stream of electrons controlled by electric or magnetic fields. The image may be thrown on a fluorescent screen or may be photographed. It was first developed in Germany c.1932; James Hillier and Albert Prebus, of Canada, and V. K. Zworykin, of the United States also made notable contributions to its development. The scanning electron microscope, introduced in 1966, gains even greater resolution by reading the response of the subject material rather than the direct reflection of its beam. Using a similar approach, optical scanning microscopes achieve a resolution of 400 Angstroms, less than the wavelength of the light being used. Finally, the scanning tunnelling microscope, invented in 1982, uses not a beam but an electron wave field, which by interacting with a nearby specimen is capable of imaging individual atoms; its resolution is an astounding one Angstrom.

____________________

The Columbia Encyclopedia, Sixth Edition Copyright© 2004, Columbia University Press. Licensed from Lernout & Hauspie Speech Products N.V. All rights reserved.

