B. F. Skinner

Introduction

B. F. Skinner was an American psychologist, author, researcher, philosopher, and inventor. He is best known for his contributions to behaviorism and his scientific approach to studying human behavior. Skinner believed that free will is an illusion and that all human behavior is learned through conditioning. According to a ranking published by the American Psychological Association, Skinner was the most eminent psychologist of the 20th century.

The Childhood of B. F. Skinner

Burrhus Frederic Skinner was born on March 20, 1904, in Susquehanna, Pennsylvania, a small railroad and coal town in the hills. His parents were William and Grace Skinner, and he was two years older than his brother Edward. His father, William, was a lawyer who built a large library in the house and bought books constantly. His mother, Grace, was a housewife. Skinner was brought up in a religious family and raised as a Presbyterian. He called his home a "warm and stable" place.

Skinner spent a lot of his childhood exploring the Susquehanna hills in the great outdoors. He was an energetic and active little child who loved building things. He once built a cart and unintentionally put the steering backward. He also attempted to construct a perpetual motion machine but was unsuccessful. He did, however, succeed in creating a wide range of other devices, including roller-skate scooters, rafts, sleds, slides, merry-go-rounds, water pistols, blow guns, slingshots, bows and arrows, and a cabin in the woods.

Skinner went to the same high school as his father and mother. He was in a jazz band in school and used to play the piano and saxophone at home. Miss Mary Graves, who taught him art and English, was his most influential teacher. Her advice probably contributed to Skinner's happiness with his high school experience and decision to major in English literature in college. He later dedicated his book "The Technology of Teaching" to her.

Skinner went to Presbyterian Sunday school every week, as was the family tradition. Miss Graves, a devout Christian, also led these religious classes. She took a more liberal view of the Bible than Skinner's grandmother, whose approach to religion was strict and rigid. Initially, Skinner enjoyed the contrast between Miss Graves's and his grandmother's opinions, but as he grew older, he lost interest in religion. One day, he went to Miss Graves and told her he no longer believed in God.

While things were generally good in the Skinner household, the family suffered a terrible loss during Skinner's adolescence. His younger brother, Edward, died at the age of sixteen from a cerebral hemorrhage. Edward had been the closer of the two sons to their parents, and after his death they turned more of their attention to Skinner. Although he loved his parents, Skinner was not always at ease with the increased attention.

Educational Background

Following his high school graduation, Skinner attended Hamilton College in New York, intending to major in English literature. But Skinner did not mesh well with Hamilton's student body. He felt it was pointless to take required courses such as mathematics, anatomy, embryology, and biology, which had nothing to do with his major. He was not a fan of parties or college football, and as an atheist, he resented the daily chapel requirement.

In 1926, Skinner received a bachelor's degree in English literature from Hamilton College. Shortly before graduating from college, he decided to return home and pursue a career as a writer. After struggling to write a captivating novel, he settled on short stories. But over the next year, Skinner managed to produce only a few brief newspaper articles. He concluded that he had "nothing important to say" because he lacked the perspective and life experience needed to become a good writer.

While working at a New York bookshop, Skinner read Bertrand Russell's Philosophy and began to consider writing about science rather than fiction. The book introduced Skinner to behaviorism and highlighted the research of John B. Watson. He then read an article by H. G. Wells about Ivan Pavlov's work and found it fascinating. In 1928, he applied to Harvard University to study psychology and was accepted.

In 1930, Skinner graduated with a master's degree in psychology, and he received his PhD in psychology one year later. A series of fellowships allowed him to continue his research at Harvard University until 1936.

One of Skinner's greatest inspirations was Ivan Pavlov. Skinner embraced Pavlov's view that "you can see the behavioral order if you have control over the environment." Skinner conducted most of his studies on animals, typically rats or pigeons, and created a variety of instruments for his research. The most famous of these was the "Skinner box." The form of behaviorism Skinner developed over time came to be known as radical behaviorism.

Skinner agreed to become a professor at the University of Minnesota in Minneapolis in 1936. Many of the studies he had begun at Harvard were put on hold during this period. Skinner was keen to help during the Second World War and tried to train pigeons to help guide missiles to enemy ships. Eventually, with the development of radar, the project was abandoned.

In 1945, Skinner relocated to Indiana University, where he headed the psychology department. But in 1948, Skinner returned to Harvard as a tenured professor. He remained at Harvard for the rest of his career.

Personal Life

In 1936, Skinner wed Yvonne Blue. Deborah and Julie were their two daughters. Skinner passed away on August 18, 1990, due to leukemia. Ten days before his death, he received the American Psychological Association's Lifetime Achievement Award and gave a speech based on the article he was writing at the time. On the day of his death, he finished writing his last article.

The Accomplishments of Skinner in Radical Behaviorism

Skinner was a strong supporter of psychology's behaviorist movement. He did, however, diverge from the behaviorism that the movement's founder, John B. Watson, had promoted. Watson thought that overt, observable behaviors should be the sole thing psychology focused on. He maintained that since private occurrences (such as ideas, feelings, and perceptions) cannot be immediately witnessed or objectively investigated, they are inappropriate subjects for study.

Although Skinner agreed that psychology's main focus should be observable behavior, he did not reject the importance of inner experience. He thought a scientific account of behavior could also incorporate private events. Such events, however, should be treated as behaviors that themselves require explanation, not as explanations for other behaviors. He argued that behavior, both internal and external, is ultimately determined by the environment. Skinner's approach to behavior analysis is known as radical behaviorism.

Traditional and radical behaviorism also differ in the significance they give to stimulus-response (S-R) relationships. Classical behaviorists such as Watson and Pavlov held that all actions are responses to stimuli that precede them. Skinner disagreed. He maintained that while the stimulus-response model can explain reflexive behaviors, it cannot account for more complicated behavioral patterns. In his view, such behaviors are defined by their consequences. This belief is the basis of B. F. Skinner's theory of operant conditioning.

What Is Operant Conditioning?

Operant conditioning is a type of learning in which the results of an action affect the probability that the same action will be taken in the future. Reinforcement and punishment are the two categories of outcomes that Skinner described. Any result that makes a behavior more likely to occur again is called reinforcement, and any consequence that makes it less likely to occur is called punishment.

Skinner carried out numerous animal experiments to test his operant conditioning theory. Many of these experiments used an enclosed chamber known as the "Skinner box." In one such experiment, a hungry lab rat is placed inside a Skinner box fitted with a lever. When the lever is pressed, it delivers a food pellet to the rat. As the rat explores its new surroundings, its behavior is erratic at first. But if it happens to press the lever and receive a food pellet, its behavior quickly changes. The food reinforces lever pressing, and the rat begins to press the lever deliberately and more often.
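The lever-pressing experiment above can be sketched as a small simulation. Every number here (the initial press probability, the size of the reinforcement step, the trial count) is an illustrative assumption, not a value from Skinner's work; the sketch only shows the core idea that reinforcement raises the probability of the reinforced behavior.

```python
import random

random.seed(42)  # fixed seed so the run is repeatable

press_probability = 0.05   # the rat rarely presses the lever at first
LEARNING_STEP = 0.10       # how much each food pellet reinforces pressing

presses = 0
for trial in range(100):
    if random.random() < press_probability:
        presses += 1
        # Positive reinforcement: the food pellet makes future presses more likely.
        press_probability = min(1.0, press_probability + LEARNING_STEP)

print(f"lever presses in 100 trials: {presses}")
print(f"final press probability: {press_probability:.2f}")
```

Run it without the fixed seed and the pattern is the same: the early trials contain almost no presses, but once the first accidental press is reinforced, presses cluster more and more densely.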

This type of learning is called operant conditioning by Skinner because the organism actively modifies its surroundings to produce an outcome. In contrast, a behavior in stimulus-response learning is prompted passively by the stimulus that came before it. Through the process of operant conditioning, an organism consciously decides to act in a certain way, and the subsequent consequences shape that behavior.

Skinner further divided each type of consequence into positive and negative forms. Here, "positive" means that a stimulus is added after the behavior, whereas "negative" means that a stimulus is removed. Reinforcement (positive or negative) always strengthens behavior, while punishment (positive or negative) always weakens it.

  • Positive Reinforcement: A rewarding stimulus is added after a behavior (for example, a baby smiles at you when you play peek-a-boo with them, or a child gets a lollipop for cleaning their room).
  • Negative Reinforcement: When you engage in a behavior, an adverse stimulus is taken away (your roommate will no longer knock on your door if you turn down the radio, or the obnoxious beeping sound will cease when you fasten your seatbelt).
  • Positive Punishment: An aversive stimulus is added after a behavior (such as getting a ticket for speeding or receiving a failing grade on an exam), making the behavior less likely to recur.
  • Negative Punishment: A pleasant stimulus is taken away after a behavior (e.g., your date stops smiling because you burp loudly at dinner, or Nora stays out past curfew and her dad takes away her cell phone).
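The four consequence types above reduce to two yes/no questions: was a stimulus added or removed, and was the behavior strengthened or weakened? A tiny sketch (the function name and example comments are ours, not Skinner's) makes the mapping explicit:

```python
def classify_consequence(stimulus_added: bool, behavior_strengthened: bool) -> str:
    """Name a consequence using Skinner's positive/negative and
    reinforcement/punishment distinctions."""
    sign = "positive" if stimulus_added else "negative"
    kind = "reinforcement" if behavior_strengthened else "punishment"
    return f"{sign} {kind}"

# A lollipop for cleaning your room: stimulus added, behavior strengthened.
print(classify_consequence(True, True))    # positive reinforcement
# Seatbelt alarm stops when you buckle up: stimulus removed, behavior strengthened.
print(classify_consequence(False, True))   # negative reinforcement
# A speeding ticket: stimulus added, behavior weakened.
print(classify_consequence(True, False))   # positive punishment
# Phone taken away for missing curfew: stimulus removed, behavior weakened.
print(classify_consequence(False, False))  # negative punishment
```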

In what ways is Skinner's theory applied today?

Operant conditioning theory has numerous practical applications. These include:

Programs for Changing Behavior: These initiatives aim to reduce or eliminate negative behaviors while promoting positive ones. A lot of behavior modification approaches are grounded in the theories of operant conditioning. One of these strategies, called a token economy, gives tokens to participants in exchange for proper behavior. Then, the individual can swap the tokens (such as coins, gold stars, or points) for goods or advantages that reinforce them. Token economies are useful in homes and establishments like mental health clinics, jails, and schools.

Animal Training: To teach complex tricks, animal trainers often use a technique called shaping. By rewarding responses that progressively approximate the desired action, trainers can teach animals intricate movements and feats that would not otherwise be feasible.
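Shaping by successive approximation can also be sketched numerically. In this toy model (all numbers invented for illustration), the animal's behavior is a single value, the trainer rewards any response close enough to a target, and the tolerance for "close enough" shrinks after each reward, so only ever-better approximations keep earning rewards.

```python
import random

random.seed(0)  # fixed seed so the run is repeatable

target = 10.0      # the trick the trainer ultimately wants
behavior = 0.0     # the animal's current typical response
tolerance = 10.0   # at first, almost any attempt earns a reward

for step in range(200):
    response = behavior + random.uniform(-2.0, 2.0)   # natural variation in responses
    if abs(response - target) <= tolerance:
        behavior = response                     # rewarded responses tend to be repeated
        tolerance = max(1.0, tolerance * 0.97)  # demand a closer approximation next time

print(f"shaped behavior: {behavior:.1f} (target {target})")
```

The key design point mirrors the trainer's job: the reward criterion tightens gradually, so the behavior is pulled toward the target without ever demanding a jump the animal cannot make.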

Biofeedback Training: Biofeedback has been applied to chronic pain and anxiety disorders. People can learn to change involuntary bodily responses, such as heart rate, blood pressure, and muscle tension, through strategies like deep breathing and muscle relaxation. Recording devices monitor physiological changes as they practice these techniques and feed the data back to them. Favorable results, such as a lowered blood pressure reading, reinforce the relaxation behaviors that preceded them.

Superstitions: A lot of superstitions come from unintentional reinforcement. Consider a gambler who, right before a significant win, happens to blow on his dice. He will probably continue to blow on the dice because that habit has been encouraged, even if it has nothing to do with his performance.

Addiction: Drug and alcohol addictions can be explained by the reinforcing effects of these drugs. Addictive drugs affect the reward system in the brain, causing relief from pain, anxiety, and discomfort (negative reinforcement) as well as pleasurable feelings (positive reinforcement). As a result, the person is encouraged to consume drugs repeatedly. The principles of operant conditioning also explain behavioral addictions like gambling.

Issues with Skinner's Theory

Skinner carried out a great deal of research to support his theory of human nature. However, most of these studies were conducted in laboratory settings using small animals such as pigeons and rats. Critics contend that because people are far more complex than such animals, these findings cannot be generalized to human behavior. Humanistic psychologists in particular criticize Skinner's approach as overly simplistic, arguing that it ignores traits specific to humans, such as free will.

Skinner has also been criticized for disregarding the emotional and cognitive components of learning. Some claim that his methodology encourages a mechanical view of human nature, treating people as passive products of their environment. Studies have since demonstrated that, contrary to what Skinner initially held, learning can occur without reinforcement or punishment; behaviors can also be acquired through observation and understanding.

Later Years

Skinner's concerns about the broader social implications of behavioral research drew him to philosophical and moral questions. He made several television appearances after publishing Contingencies of Reinforcement in 1969 and Beyond Freedom and Dignity two years later. In response to widespread misunderstanding of his work, he wrote About Behaviorism (1974). He continued to work until the end of his life. In addition to professional papers, he authored three autobiographical volumes: Particulars of My Life, The Shaping of a Behaviorist, and A Matter of Consequences. He was diagnosed with leukemia in 1989 but remained active for as long as his growing weakness permitted. Ten days before he died, he spoke before a packed auditorium at the American Psychological Association. On the day of his death, August 18, 1990, he completed the article on which that talk was based.
