Monday, October 13, 2014

Everyday Scope Creep

For this week's assignment, we had to identify a project in which scope creep took place. I have to admit that I'm a little late in posting this because I couldn't think of a situation in which scope creep actually happened to me personally, or to a team I was on. So, I will talk about a kind of "scope creep" that I have witnessed. By the way, for those reading this who aren't in my Walden classes, "scope creep" is what happens when you are working on a project and the scope is either ill-defined or not strictly followed, and the amount or nature of the work starts to increase and/or change in unexpected ways as a result.

The first example I can think of is a project where the scope is well-articulated but ill-conceived. I came up with this example after reading my friend David Smith's blog post on his home project adventures. When I was young, my grandparents wanted to flatten a hilly patch on the side of their one-acre back yard with the help of my mom and me. The hilly patch was about seven to ten feet high, about seven feet wide, and thirty feet long. The plan was to rent a rototiller, manually churn the hill, shovel the dirt into wheelbarrows, and dump it on some exposed tree roots in the back of the yard. So we had a plan and knew what everyone would be doing. However, the work was backbreaking and grueling. The soil was hard, packed clay, and it wasn't long before the ground turned very rocky and the rototiller was having a tough time getting through it. After a very long day-and-a-half of work, my mom said that she thought this had gone too far, and that the job would be better done with an earth mover, such as a Caterpillar. Grandma disagreed with that idea, perhaps because she already felt committed to making the original plan work. Eventually, she relented, and a professional earth mover came out to look at the job. He said it would take him only an hour to finish it, and, by the way, the clay we were throwing on the tree roots was going to kill the tree. So it was a good thing we called him. Sometimes the scope can seem just right when in fact it was poorly designed to begin with. A little more analysis on the front end might have made us all realize that the job was too big for us and for what we had planned.

My second example is what I'll call "association/affiliation/partnership scope creep." This is when a company is partnered with another company, often a customer or client of some kind, the benefits and perks of being a "number one" or highly valued customer are not clearly defined, and the customer keeps nibbling, slowly ratcheting up its demands on the service-providing company. I've seen the companies in the "number one" spot dictate all kinds of terms to the service-providing companies I've worked for: making special exceptions to company policy, forcing employees to undergo extra-invasive security protocols to make the client comfortable, bumping the client to first in the queue for service ahead of several other customers, and pretty much whatever else the customer demands. This seems like the corporate equivalent of codependency or unhealthy boundaries, and it can have devastating consequences for both companies: one company ends up bending over backwards and compromising its standards and corporate identity to appease the customer, while the client develops unrealistic expectations and gets frustrated with the service provider when those expectations are not met. If the association were treated like a special type of project, you could create a statement of work for the special customer that spells out all of the specific benefits the special status or partnership affords them, and also what the limits are--the deal-breaking things the service-providing company cannot or will not do, so don't ask. Agree to certain perks and then stick to them! If the client company wants (or demands) more, that would require a new contract or statement of work, and both parties would have to agree to the new written terms. 
The service-providing company retains its identity and stabilizes its workload, and the client is happy because they know what to expect and they're getting it, and they still have a "special status" with the service company.

So those are my examples of "scope creep." They may not be dead-on the money, but they at least hit on some scope creep elements...

Thanks for reading!

Friday, October 3, 2014

Resources for PM Budgeting, Scheduling, and More



            Being a project manager can often feel like a daunting task if you focus too much on the entirety of the project. Fortunately, there is an amazing wealth of resources aimed at helping new project managers, or experienced project managers looking to update their practices. This week, I did a little hunting around and found a few such resources for your consideration.
            The first is a site called www.brighthubpm.com. Specifically, I took a look at their templates and forms page, found here: http://www.brighthubpm.com/templates-forms/124740-collection-of-free-project-management-templates-and-forms/. I find that forms and templates help me stay organized and break large projects into manageable chunks of information, and this site has forms for every step of the project management process. In particular, it has a link on how to create a RACI or RASCI chart, which Stolovitch mentions in this week’s reference materials. The RASCI (Responsible, Accountable, Support, Consulted, Informed) chart is a matrix that delineates all of the work tasks to be completed on a project, who is supposed to complete the work, and so forth. Each task is assigned to a person or people and given an estimate of the number of hours it will take to complete. This can be used to generate budget figures, too: simply generate a cost-per-hour estimate for the specific people assigned to the tasks, plus any and all tools and resources they will need. A word of caution: remember that what is “needed” should be measured against the approved budget and modified accordingly if necessary. www.brighthubpm.com has a lot of other great templates too, such as for scheduling and other planning phases…
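To make the cost-per-hour arithmetic concrete, here is a minimal sketch in Python (a spreadsheet would do the same job). Every task name, assignee, hour figure, and rate below is a made-up example for illustration, not data from any real project:

```python
# A minimal sketch of turning task assignments into a labor-cost estimate.
# All tasks, people, hours, and rates are hypothetical examples.

tasks = [
    # (task, assignee, estimated hours, hourly rate in dollars)
    ("Storyboard module", "Designer", 16, 40.0),
    ("Record narration", "SME", 4, 60.0),
    ("Build quiz items", "Developer", 10, 50.0),
]

total = 0.0
for task, person, hours, rate in tasks:
    cost = hours * rate          # cost of one task = hours x hourly rate
    total += cost
    print(f"{task} ({person}): {hours} h x ${rate:.2f}/h = ${cost:.2f}")

print(f"Estimated labor cost: ${total:.2f}")
```

In practice, you would then compare this total against the approved budget (plus tools and other resource costs) and trim or rescope tasks accordingly.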
            One site said that the hardest budget you will ever have to put together is your first one. Budgeting for beginners is difficult because, again, there is a lot to consider, and some “hidden” or indirect costs can be missed in the budgeting process. This blog post, along with this week’s resources, helped me to think of budget as a matter of time as well as money: http://www.techrepublic.com/article/creating-your-project-budget-where-to-begin/ The page also covers common mistakes to avoid in creating the budget. This is helpful because more than half the battle of creating plans, schedules, and budgets is thinking about them in a productive and thorough way. Any resource that helps orient you to a better understanding of the budget is helpful.
            A final resource worth sharing is Cathy Moore’s site. In this blog post, as well as others, she talks about how to create value in e-learning in a way that is also budget-friendly: http://blog.cathy-moore.com/2010/05/how-to-design-elearning-thats-memorable-and-budget-friendly/ Cathy Moore’s emphasis on value reminds me that every budgetary consideration should start with the question of value and how it will increase training effectiveness. With this emphasis in mind, you may realize that you don’t need to develop something you originally thought was essential to the project, or you may reevaluate the budget plan after assessing the project as a whole. This “meta-budgetary” question can lead to weeding out a great deal of cost-suckers that aren’t really adding value to the training project, and keep the project on budget.
            What do you find difficult about creating a PM budget? Let me know if you find any of these resources useful in the comments below!

Friday, September 19, 2014

Layers of Communication

It's a widely cited (and often oversimplified) finding from Albert Mehrabian's research that only about 7% of what is communicated in a face-to-face conversation about feelings is conveyed by the words themselves, while roughly 38% of the message is carried by tone of voice and about 55% by body language. Though many of us have heard some version of this, we are not always conscious of it when we are communicating with others at a distance via technology. It is better to show than tell, however, so below is an exercise in which the exact same message is communicated in three different ways: through email, voice mail, and face-to-face:

http://mym.cdn.laureate-media.com/2dett4d/Walden/EDUC/6145/03/mm/aoc/index.html

Just reading the email alone, it reads as a genuine but urgent request for an ETA on a project that may affect the deadline of a colleague. There is reassurance in the email, in phrases such as "I know you're busy...", but it is hard to tell whether this is "true" reassurance and understanding, or merely "couching language"--language used to pad or prep the reader for the upcoming discomfort. What comes across most in this first exchange, to me, was worry. She really needs this ETA or she'll miss her deadline. She's likely using couching language to mask her desperation...

Then, I listened to the phone message. In the phone message, the reassurance comes across more strongly, and I got less of an impression of worry or urgency, though it is still there. Hearing the actual voice allows certain words to be emphasized. The "I really appreciate your help" comes across strongly. Most of the voice mail sounds as though she's reading from a cue card, which reduces the sense of urgency in the message.

Finally, the face-to-face message was the least alarming. The tone of voice sounds even calmer than the voice recording (unfortunately, this may be because different actresses perform the voice mail and the face-to-face version, which is an unfortunate variable in the experiment). In the face-to-face communication, she begins with a polite smile and has relaxed body posture. The feeling of worry that I got from the first message does not come across here. Her level of deference makes me want to fulfill her request sooner, rather than feeling obligated to, as the email made me feel.

This exercise is an important reminder to choose your method of communication wisely. If you are working in a group, discussing sensitive subjects with more emotional gravitas should probably be left for face-to-face or a Skype call (or Adobe connect, etc.). However, email is expedient and is great for task-oriented messages, so long as the sender remembers that a lot of the intended tone could or will be lost in translation.

For your consideration: here is another, similar experiment that I think has better controls. It concerns "the halo effect" as it relates to attraction. A male actor reads the exact same words from a sample dating profile, filmed on a webcam for potential dates to view. In one version, he reads the words in a negative tone of voice (with a facial expression to match). In the second version, he reads the words in an upbeat tone. One group of women watches the first video; a second group watches the second video. The experiment and their reactions can be found here: 


Friday, September 12, 2014

How to Paint a Project into a Corner

Some projects, when you are finished with them, let you look back with a sense of satisfaction that you accomplished something, and that all of your efforts will pay off in making a situation better for yourself and those around you. And then there are the projects marked by a certain “if only I knew then what I know now” feeling. These are the projects that are less than enjoyable to look back on, but they are often the most instructional if we do take the time to examine them.

For my bachelor’s program, I had to come up with a training module that would resolve a real-world problem that I or someone I knew was facing, and then create that module. I often have difficulty coming up with these sorts of projects because I am not in a position where such issues are easy to find. I was not even a graduate assistant at the time—I was, and am, working at a hotel as a painter. The paint room was incredibly disorganized, with over a hundred different paints and stains and only a loose organization as to what went where. The paint chart that explained which paints went where was a series of chicken-scratched papers on a clipboard, and it was very difficult to tell which information was most current. Further, I was told that there was an issue with engineers painting the wrong color on the walls—that engineers would just grab a color that looked right and paint a surface with it, only to find out it was the wrong color in the morning. To meet the requirements of my course project, I had to include certain multimedia elements, and this pushed me to do more than what I felt the issue at hand merited. Perhaps a lot more…

The former painter seemed to believe the problem was simply one of motivation—that people did not keep the area organized by putting things back where they belonged. To help with this, and to clear up the confusion about which paint corresponded with which surface, I created a paint chart catalog with pictures and number callouts that showed the paint that belonged on each surface in the hotel. It was basically a job aid on steroids as well as a way to organize the paint room. I also created a system of labeling the shelves with “warehouse locations,” and the binder with the paint chart showed where each paint was located on the shelf. It’s an alphanumeric coding system that’s used by the military warehouses as well as other warehouses the world over. Throughout most of this project I didn’t really consult with anyone on the details. I mostly saw this project as my prerogative to fix the paint situation.

As a result, it has more or less remained my project. I didn’t really get buy-in from anyone, and now that the project is finished, most of the engineers complain that the location system and the binder are “too complicated.” I tried to make the project scalable in case the organization caught on, so I added more letters and numbers than were needed for the shelves in the paint room, and most people never bother to take the time to figure out what the location breakdown means. Now that I have been doing the painting detail for a while, I’ve also begun to realize that engineers rarely paint the walls the wrong color. What looked like wrong-color painting turned out to be one of two things. First, someone ordered the wrong sheen or type of paint. For example, the paint was supposed to be Eg-Shel and someone ordered flat instead, or it was supposed to be the “Mack Creek” color from the Sherwin-Williams “Harmony” product line but they got the “Direct to Metal” Sherwin-Williams paint instead. Either mistake affects how the paint looks on the walls and makes it look like the “wrong color.” Second, it is very difficult to mix a gallon of paint to be exactly the same color batch after batch. For more reasons than I can name, Sherwin-Williams will often give us a paint that is “slightly off,” because they change the formula of their colors or there are minor variations in the paint. Also, if the walls haven’t been painted in a long time, the color can fade, and applying new paint can leave an undesired contrast where the touching up occurred. In both cases, the only real fix is to repaint the entire surface, and to make sure the paint is the right sheen and type to begin with.

In reviewing Greer's (2010) “post-mortem” questions for projects, I think there are some beneficial pieces of my project deliverable—the paint chart catalog. However, because I didn't get buy-in from others in advance, redoing the catalog's ordering system is likely in order, to make things appear less complex to the others who might use the paints, rather than their just asking me where something is. There were warning signs of this, and rather than halting the project I moved forward, because I was also completing the project for the course grade. Perhaps that was the first mistake.

References

Greer, M. (2010). The project management minimalist: Just enough PM to rock your projects! Minneapolis, MN: Laureate Education, Inc.

Sunday, September 7, 2014

EDUC 6145 Project Management for Training and ID

Hello everyone. If you'd like to follow me, leave a comment below.

Best,
~Nathaniel

Friday, June 6, 2014

Seven Star-Crossed, Virtually Faultless Movies on Life, Love, and Occasionally Loss

The Fault in Our Stars. © 2014, 20th Century Fox. Included under "fair use" for criticism, comment.

The film adaptation of John Green’s novel The Fault in Our Stars opens today. While I have not seen the movie yet, I have read the book, and it is the best thing I’ve read in years. For a story this good, labeling it “Young Adult (YA)” almost sounds like a pejorative, as it immediately conjures up imagery of Twilight, Harry Potter, Divergent, or any number of other interesting but sloppily written and executed stories for a less-than-discerning audience. The other label, that it is a “teen cancer love story” or something to that effect, draws immediate comparisons to the movie A Walk to Remember—a likable but unfortunately saccharine movie panned by critics. The novel The Fault in Our Stars, if not the movie, defies both of these categories.


Rather than explain why directly, I’d like to share a list of seven other movies that I feel are in the same spirit as The Fault in Our Stars. Some movies paint by numbers and follow a winning formula that all but guarantees the audience a happy ending within the first few minutes. Then there are films that make another sort of pact with their audience: simply to be honest, sometimes ruthlessly so, through the filter of fiction, and yet somehow retain their heart. These rare gems are among my all-time favorite films, and as it happens I’m not alone: all but one of them score over 90% on Rotten Tomatoes, and all have come out in the last ten years. Instead of ranking these movies, I ordered them in such a way as to weave a narrative of my own…

1.       Once (2006)


How do I describe Once for people who haven’t seen it? If I say it’s a musical, you might think Rodgers and Hammerstein, and that’s the furthest thing from what it is. And yet it is a musical, but an organic one. The two leads are musicians in real life as well as in the story, and the songs occur so naturally that their beginnings and endings feel like tides rolling in and out on a beach. This Irish indie film is so minimalist that it doesn’t even bother to name its two lead characters—not that you’d notice if someone didn’t point it out to you. That’s the magic power this movie has—making the characters and situations feel so real that you’d almost think you were watching a documentary, but a fun and incredibly heartfelt one at that. This story is about a man and a woman who have a chance encounter, the intangible but ever-present mutual friend they have in music, and how they change the course of each other’s lives.

2.       The Spectacular Now (2013)


Speaking of great minimalist indie films, here’s another. From the writers who brought you the screenplay of 500 Days of Summer comes this movie adaptation of the book by Tim Tharp. It features rising star Shailene Woodley (also in The Fault in Our Stars) and Miles Teller in the lead roles. From the first time Woodley’s face appears on-screen, sans makeup, hovering over passed-out teen partier Sutter (Teller), it’s clear that The Spectacular Now is set on a much different course than your typical over-produced, glitzy teen movie. It eschews the gratuitous in exchange for the honest and psychologically bare. It is at times sweet, at times disturbing, and always unflinching in its look at first love. 
 

3.       500 Days of Summer (2009)


I mentioned that the writers who adapted The Spectacular Now, Scott Neustadter and Michael H. Weber, also wrote 500 Days of Summer because both movies deserve a spot on this list. Zooey Deschanel and Joseph Gordon-Levitt both give excellent performances in this quirky but poignant film, which reviews the course of a relationship between two coworkers at a greeting card company. It’s the perfect backdrop for exploring how love can seem like an unstoppable force one moment, and false and trite and crappy the next. In the end, “things” seem to work out, even though the movie warns us in the beginning that they won’t. It’s an irresistibly charming and upbeat story. It also features an enjoyable supporting turn from Chloe Grace Moretz as the no-nonsense younger sister of Gordon-Levitt’s love-struck character.

4.       Let Me In (2010)




Chloe Grace Moretz in Let Me In. © 2010, Overture Films. Included under "fair use" for criticism, comment.

Chloe Grace Moretz, aside from a few great standout roles, seems to be typecast in a lot of horror movie remakes. With Let Me In, however, it really worked in her favor. Let Me In (2010) is a remake of the Swedish film Let the Right One In (2008), based on the book of the same name translated into English. I’m tempted to say it’s a certain kind of movie in much the same way some would describe Twilight: “It’s a ______ movie.” But I’m not going to do it. I simply refuse, just like I refuse to call Warm Bodies a “zombie” movie. It isn’t. And thank goodness for that. The best way I can describe it is that Let Me In is a children’s romantic horror movie. I’m fully aware these words don’t go together, and that’s the genius of this film. They shouldn’t. This story shouldn’t work. At all. But not only does it work, it works beautifully. A mysterious girl moves in next door to a disturbed and troubled twelve-year-old boy. They form a very unorthodox friendship, and more, which has dire consequences for both of them and for the people in town. Set in the '80s, the film is nostalgic, sentimental, disturbing, and gory all at once. If you like your horror served with a little more “brains” than the typical “undead” movie with no discernible pulse, then this is a must-see.  

 

5.       Synecdoche, New York (2008)


Not all horror movies have monsters or graphic violence, but make no mistake: Synecdoche, New York is not for the faint of heart. With Synecdoche, New York, screenwriter Charlie Kaufman set out to make a movie about all of the real-life fears that plague him, and thus to make a horror movie that isn’t a horror movie at all, but a slow creep through all of life’s pitfalls and vagaries that can make life extremely traumatic and challenging. We often find ourselves repeating the same patterns and mistakes in life, and replaying situations in our heads, until our lives begin to feel like an infinite regress—a Droste effect (“mise en abyme”), as if we were standing between two mirrors. Such is the case with Caden Cotard, a theater director who won a MacArthur grant for his rendition of Death of a Salesman. Feeling the pressure to do something truly great and brutally honest, he gets lost in his own creation: an endless and infinitely repeating simulation of the city, built in a warehouse and populated by countless actors. To say that the movie takes on a surreal quality is an understatement, and there is so much meta-commentary and subtext that you have to see it twice or more to fully appreciate it. It is simultaneously one of the most depressing and most exhilarating films I have ever watched, and it is also one of my favorites. If you are the kind of person who appreciates the smallest details of a film being imbued with subtle meaning (e.g., the main character’s last name, “Cotard,” is the name of a syndrome in which someone believes they are dead), then this film is for you. This film makes you work for your meal—it had me ruminating for days and weeks afterward—but if you can stomach it, this and the next movie on the list are screenwriter Charlie Kaufman at his best.

 

6.       Eternal Sunshine of the Spotless Mind (2004)


Another great Charlie Kaufman movie, this one with director Michel Gondry. Like The Fault in Our Stars, the story of how the film gets its title cleverly hints at its premise with a story of its own. The title is derived from a line in an Alexander Pope poem, “Eloisa to Abelard,” in which Eloisa, so distraught by the dissolution of her relationship with Abelard, wishes him to forget her. And thus enters the premise of this movie, in which a shady work-from-a-white-van company promises to relieve you of the pain of loss by erasing your specific memories of a former loved one. You’ll wake up and everything will go on as normal, except you’ll have no recollection of the other person whatsoever. The movie takes place in the mind of Joel Barish, played by an unusually sober Jim Carrey in a career-best performance, as the erasers try to remove his former love interest (played by Kate Winslet) from his mind. As such, the whole movie takes on a dreamlike state as Barish recounts the relationship while the memories begin floating away…

 

7.       Her (2013)


Spike Jonze has directed a couple of Charlie Kaufman’s screenplays and was a producer on Synecdoche, New York. For Her, though, Jonze wrote and directed the film himself, and the similarities between Kaufman and Jonze in their themes and their approach to movies are evident. Theodore (Joaquin Phoenix) works for a futuristic company that ghostwrites and “handwrites” (which looks a lot like a cursive font printed from a printer) sentimental letters to the loved ones of its clients. Theodore speaks to his computer, and the words, as well as the sentiments within them, seem to materialize out of thin air. It’s a good setup for the entire film—quirky, sentimental, and buried under an unsettling artifice. On the surface, Her is a story about a guy who falls in love with his advanced new operating system (Scarlett Johansson). But it’s more than that. It’s a story about how we relate to technology and to other people. It’s a story about loneliness, longing, and the narratives we create in our own minds. It even hints at the concept of technology gaining self-awareness and perhaps even surpassing us. There’s a lot more to this movie than artificial intelligence. Joaquin Phoenix and the smoky, often vulnerable voice of Scarlett Johansson will draw you in and make you believe that it’s all too real.


Well, that’s my list. Let me know if you see (or have seen) any of these movies and tell me what you think. What movies make you think, and experience a little “infinity” in your finite hours?
 

Monday, May 12, 2014

Watch Cosmos: A Spacetime Odyssey with Neil deGrasse Tyson!

Cosmos: A Spacetime Odyssey with host Neil deGrasse Tyson on USA Today: Special Edition

Cosmos: A Spacetime Odyssey, with host Neil deGrasse Tyson, has brought me so much joy recently. This show is the revitalization and re-visioning of Carl Sagan's miniseries of the same name, which was an international phenomenon when it originally aired. The new series is on Hulu now, as well as on FOX at 9/8c and Nat Geo at 10/9c. Every episode is another key to our magical and real universe, using equal parts direct observation and exploratory imagination, art and science, to lay bare the discoveries of our past and how we came to know them. Using state-of-the-art CGI, animation, and the "ship of the imagination," Tyson takes us on a journey from the farthest reaches of our universe, to the dawn of time as we know it, to the inner workings of the most infinitesimal single-celled organisms. The most brilliant scientific breakthroughs are told in story form, with explanations that show the cosmos in its full complexity while remaining easy for the average viewer to understand.

Several episodes help to right the injustice done to one scientist's name or another--scientists most of society has long forgotten, yet to whom we owe such a debt of gratitude that they ought to be household names. Perhaps, with any luck, you or the twelve-year-old in your life will be thanking Michael Faraday for everything from your microwave to your computer (see Episode 10), or Annie Jump Cannon and Cecilia Payne for unlocking the secrets of the sun (see Episode 7).

This is the best thing on television right now, and if you're a science buff or a sci-fi fan, or just plain curious and imaginative, there's no time to lose. There are only three weeks left, so now is the perfect time to binge watch and catch up! Use this link to start watching:

http://www.hulu.com/cosmos-a-spacetime-odyssey

Also, pick up a special edition of USA Today (pictured above) for an extended look at Cosmos. It's a great read!

Sunday, April 27, 2014

Reflection: The Evolution of Learning Theories, Expanded


This is a final reflection post in a loose series of posts on learning and learning theories (namely behaviorism, cognitivism, constructivism, social learning theories, connectivism, and adult learning). This time, I hope to revisit some previous ground about my views on individual learning theories and on learning theories as a whole, while also adding some new metaphors and a framework for thinking about all of the learning theories together as a cohesive whole. In doing this, it is my hope that we can move toward a more scientific and unified theory of learning as our understanding of biology, neuropsychology/psychology, technology, and sociology advances.

I. Cultural Philosophy of Learning: How the Evolution of Learning Theories Mirrors Our Own

            I explained in a previous post that I feel learning theories have as much to do with the cultural philosophy and spirit of the times in which they were introduced as with their basis in empirical studies. This actually becomes cyclical: the culture affects our understanding of learning; the resultant learning theories affect the culture and our methods of teaching and learning; and that in turn affects how we think we learn, adding to the theories again, and so on…

            To drive the learning-theory-evolution metaphor home, I will describe and compare each learning theory to a time in our history. The first learning theory on the timeline is behaviorism, which draws inspiration from the work of Pavlov and the conditioning of his dogs to salivate at the dinner bell (Standridge, 2001). Behaviorist B. F. Skinner is also well known for his experiments with pigeons pressing a lever and developing superstitious rituals to increase their chances of getting food. Behaviorism, with its emphasis on stimulus-response, also represents the most primal way in which we learn. All of us have inherited a “reptilian” or “old” brain, which still very much learns via stimulus and response. I was recently watching a South Park episode featuring Cesar Millan, a.k.a. “The Dog Whisperer.” The episode parodies Millan’s behaviorist approach toward out-of-control dogs by applying it to the show's budding sociopath, Eric Cartman. (Note: the link contains language some may find offensive: https://www.youtube.com/watch?v=Rx_lTgUSyB4). At the end of the clip, we start to see Cartman roll on his back and become submissive. Toward the end of the episode, Cartman transforms into the perfect child, but it is implied he reverts after his mother does not stick with the behaviorist approach taught by Millan.

Behaviorism’s popularity tapered off with the introduction of cognitivism, which focused on the internal mental states that Skinner’s behaviorism completely ignored. This, and every learning theory since, has led some to believe behaviorism has been abandoned as obsolete. Kerr (2007) argued otherwise and showed how behaviorism and other learning theories evolve with time, and may lose popularity, but not relevance: philosopher Dan Dennett expanded behaviorism to include internal mental states with his notion of “generate-test,” the idea that animals can generate hypotheses and test them in their minds before acting them out. This shows not only that how we learn evolves with each new learning theory proposed, but also that individual learning theories themselves evolve. While generate-test behaviorism, according to Dennett, may be the only explanation for learning that does not result in circular reasoning, the rather dull teaching methods of behaviorists left a lot to be desired, both then and now, and do not embrace the widest spectrum of the myriad ways in which learning can take place.

            Enter cognitivism. Cognitivism, like behaviorism, takes an empirical approach to how we learn, but focuses on the internal mental states that behaviorism largely neglected. So if behaviorism represents the evolutionary behavior inherited from our ancestral species, cognitivism, with its metaphor of the “mind as a computer,” resembles the rationalist approach of the Enlightenment. Cognitivism emphasizes problem-solving in a linear and logical manner (Ormrod, Schunk, & Gredler, 2009). Its emphasis on thought and self-monitoring seems to echo the philosophy of self-governance that was prevalent in the Enlightenment era.

            Constructivism, however, rejected both the heavy emphasis on observable, objective reality in behaviorism and the linear rationalist approach of cognitivism. Constructivism purports that learners construct their own meaning based on past learning and beliefs, and that what is true is relative to each learner. Situated cognition, or place-specific learning, is also a prevalent idea in constructivism (Ormrod, Schunk, & Gredler, 2009). This is very much like the painting of the Romantic era, which often emphasized place and nature and hid symbols and meaning in the work, giving it a very personal, spiritual, and subjective feel. In this way, constructivism follows and rejects cognitivism in much the same way the Romantic era followed and rejected the Enlightenment and early Industrial Revolution thinking.

            The Industrial Revolution had an incredible impact on population growth, human understanding, and the planet at large. The next great revolutions to come about were globalization and the electronic/Internet revolution. Globalization and the shrinking of our social world have given rise to social learning theories, which emphasize the effect our communities and groups have on our learning. As cultures and people within groups clash, that arguing process becomes internalized in the learner, which informs his or her perspective on a viewpoint (Laureate Education, Inc., n.d.).

The Electronic Revolution and the Internet, in combination with social learning, have led to one of the most recent learning theories, connectivism, which holds that learning relies upon the depth and richness of one’s social and technological networks (Davis, Edmunds, & Kelly-Bateman, 2008). In terms of today’s culture, I find connectivism the most readily accessible and accurate explanation of the learning activities we engage in amid contemporary culture. However, I fully acknowledge that this is my bias as someone who is Gen Y and living in a technologically advanced Western society in the 21st century. This learning theory is not easily “backward compatible” with how we learned centuries ago, but it does explain how our learning has evolved, and why a new learning theory needed to emerge with it.

             Finally, all of these technological and global changes have given rise to unprecedented competition and made our lives busier and more complex, not simpler. This means the adult learner has more responsibilities and therefore needs to be efficient with his or her learning. Many adult learning strategies accordingly focus on learners’ individual needs and offer them individualized support. The adult learner needs to be given choices and shown that what they are learning is relevant to their changing social and occupational roles (Conlan, Grabowski, & Smith, 2003). This is the individualized consumer model applied to learning theory; in other words, it’s the Burger King/Starbucks “Have it your way” effect. Many of the principles of adult learning can be applied to children as well.

 

II. Extended Conclusions: Falling in Love with Learning Theory Pluralism, and Not Missing the Tree for the Rings

            So there is an overview of the learning theories, and a timeline metaphor for how one theory evolved into the next just as we evolved genetically and memetically. This is by no means a perfect metaphor, nor is it meant to insinuate that the time periods I likened the learning theories to directly inspired them (except for the last few, social learning, connectivism, and adult learning, where the periods I chose actually coincide with the changes that inspired the theories). Again, some assume that because the newer theories came later, they are automatically better or replace the older learning theories. I like to think of it instead as the annual rings on a tree. Each learning theory is a new ring or chapter in our history, and each new theory that comes along is placed “on top of” the existing rings rather than replacing them. There is a way for the theories to be used in tandem with each other.

What I’m suggesting is not new, but is a sort of “learning theory pluralism,” in the same way that some people refer to themselves as religious pluralists, or that some psychologists and philosophers draw from more than one school of thought. We should use the best parts of each theory as each situation warrants. The one problem with this is that our learning theories (or most of them, at least) endeavor to be empirically based, and several aspects of each theory seem to contradict the others. As with pluralism, if each proverbial blind learning theorist is feeling a different part of the elephant, the goal is to feel every part, communicate, and eventually “see” the entire elephant for what it is.

In order for learning theories to rival the validity of scientific theories such as Darwinian evolution or the Theory of Relativity, I believe the learning theories should be further expanded and combined into one cogent, interdisciplinary theory. The reasoning for at least attempting this is simple: if the differing theories all have valid points, those points, if based on empiricism, should at some point be agreed upon and compiled. If they can be compiled without contradiction, you have a more powerful theory. If not, you go back and find where the conflicts or discrepancies remain, which might lead to a better understanding of which pieces of the learning theories do not work as well. This is the same process now being attempted in physics to reconcile Quantum Mechanics with the Theory of Relativity. We believe we have a handle on the macro, we are now attempting to understand the micro, and the two theories seem to contradict each other yet both appear true. How is this possible? The attempt to understand this will lead to greater discovery, as it will with merging learning theories into one cohesive and comprehensive narrative.

Perhaps this narrative sounds a little like falling in love with your soul mate, which requires one to be “firing on all cylinders.” You need to connect with your other in a primal, perhaps visceral way (behaviorism: do you respond to each other in an instinctive and precognitive way?), socially (do you share similar friends and social learning networks?), intellectually (cognitivism: do you think and solve problems in compatible ways?), in a constructivist, spiritual-psychological way (do you share similar subjective experiences and construct similar meanings in the world?), in a connectivist way (do you integrate your technological world and social world in compatible ways?), and in terms of adult learning (do you share similar adult responsibilities, such as career, kids, and finances, and learn from your real-world problems so you can solve them?). Again, this is a metaphor, but perhaps the imagination used in producing such a metaphor will also spark new insights into how we can further the study of each learning theory on its own as well as how we can merge the theories into a consistent scientific narrative. If each learning theory is an annual ring on one whole tree, the goal is not to miss the tree for the rings (i.e., the forest for the trees).

 
References

Conlan, J., Grabowski, S., & Smith, K. (2003). Adult learning. In M. Orey (Ed.), Emerging perspectives on learning, teaching, and technology. Retrieved from http://projects.coe.uga.edu/epltt/index.php?title=Adult_Learning

Davis, C., Edmunds, E., & Kelly-Bateman, V. (2008). Connectivism. In M. Orey (Ed.), Emerging perspectives on learning, teaching, and technology. Retrieved from http://projects.coe.uga.edu/epltt/index.php?title=Connectivism

Kerr, B. (2007, January 1). _isms as filter, not blinker [Blog post]. Retrieved from http://billkerr2.blogspot.com/2007/01/isms-as-filter-not-blinker.html

Laureate Education, Inc. (n.d.). “Theory of social cognitive development.” Retrieved from https://class.waldenu.edu/webapps/portal/frameset.jsp?tab_tab_group_id=_2_1&url=%2Fwebapps%2Fblackboard%2Fexecute%2Flauncher%3Ftype%3DCourse%26id%3D_4198570_1%26url%3D#global-nav-flyout.

Ormrod, J., Schunk, D., & Gredler, M. (2009). Learning theories and instruction (Laureate custom edition). New York: Pearson.

Standridge, M. (2001). Behaviorism. In M. Orey (Ed.), Emerging perspectives on learning, teaching, and technology. Retrieved from http://projects.coe.uga.edu/epltt/index.php?title=Behaviorism

Monday, April 21, 2014

Thoughts on Learning Theories and Their Continued Advancement


For all our advancements in the understanding of the human brain, it is still the ultimate Rube Goldberg machine enshrouded in a black box: we glimpse evidence of cogs turning, things buzzing and clunking, but a diversity of theories still abound as to how learning takes place between “input” and “output.” As with contemporary psychology, where most psychologists draw from many schools of thought rather than being a “pure” neo-Freudian or a strict adherent of Carl Rogers’s humanist approach, I think most instructors and instructional designers use their understanding of behaviorism, cognitivism, constructivism, social learning, and adult learning together. At the beginning of the class, I gave the example of how learning a language might be approached from the behaviorist, cognitivist, and constructivist/social perspectives: the behaviorist might run flash card drills where the social constructivist would have students speak in groups. But don’t we all do this? Don’t we draw upon the different learning theories as tools, frameworks for understanding, or different “hats” that we wear when we instruct others? Most language classes seem to draw upon various learning theories quite naturally in the variety of approaches they take to the same subject in the same class. The same goes for technology and learning techniques on the job. As for learning styles, the same principles apply: research has only verified two general learning styles, visual and auditory/verbal, and both can be easily satisfied by a learning segment that has both audio and visuals (Laureate Education, Inc., n.d., “Strategies”). I also went on record saying that I find even this to be a somewhat dubious claim. Barring blindness, if you are a human being, about 80% of the information you take in is through your eyes. As a species we favor sight over hearing among the five senses, just as dogs favor smell over sight and sound.

This isn’t to say that I think nothing can be proven or said definitively about learning styles and how we learn. It is to say that I believe learning theories are more philosophically and culturally based, and less evidentiary and scientifically based, than we would like to believe, which can limit both what they ultimately tell us about how we learn and how and why to apply each theory in a given situation. That said, here are two ways in which my thinking on learning theories has become more nuanced, even though my overall view of them has stayed approximately the same.

1. New learning theories, and updates to older ones, have to change with culture and with new evidentiary understanding, because culture affects and changes how we learn. Our world and the tools we use have changed vastly since Watson and Skinner first proposed behaviorism. Some behaviorist proponents said that if your child cried out for attention, you shouldn’t give them attention or swaddle them, because it would only encourage the negative attention-getting behavior. We shudder to think of doing this now; today’s ideal parents are soccer moms and helicopter parents. We also didn’t have the Internet in the 1940s and ’50s, so perhaps blackboards and flash cards made the most sense at that time. Today’s culture, by contrast, has spawned connectivism, which purports that learning is culled from an emerging pattern or narrative derived from various sources within the learner’s technological and social networks (Davis, Edmunds, & Kelly-Bateman, 2008). As a Gen-Y who uses the Internet for practically everything, connectivism certainly feels relevant to me. Adult learning has also become popular, at least in corporate settings, because it places a greater emphasis on direct applicability to the job or one’s changing social role as an adult. Social learning theories likewise reflect our culture’s increased value placed on interdependence. Things are changing at such a fast pace that it is important for adults to adopt lifelong learning. The tools have changed, which has changed the culture, so it only makes sense that new learning theories evolve and expand too, to account for the new methods we use to learn and approach the world.

2. Despite whatever holes learning theories have, an educated guess at good pedagogical practice is better than flat-out trial and error, corporate fads, or the whims and intuitions of managers and the well-meaning but ill-informed peanut gallery. Everyone has a “hunch” or opinion about how they themselves, and often everyone else, learn best, but there are far fewer evidence-based theories of learning. If nothing else, the learning theories are a great starting point for creating instructional material, as well as a way of getting back to basics if a project is getting away from itself. Corporate fads can also drive training initiatives if one lets them; I can’t count the number of times I have heard “the cloud” or “big data” from people who don’t really know what those terms mean. Learning theories put technology in its proper place, so that people don’t get wrapped up in technology for its own sake.

Conclusions and other final thoughts: Learning theories are at their best when used in tandem, because different learning theories emphasize different aspects of learning, which correlate best with specific learning tasks. Used together, each theory helps cover the holes in the others (some describe learning rather than explain how it works, one leaves internal cognitive processes out of the picture entirely, another overstates how relative learning is, and so on). Technology affects culture, and both affect the ways in which we learn, which in turn are reflected by our changing learning theories. In order for learning theories to have the validity of other scientific theories, however (such as the theory of gravity, quantum mechanics, or Darwinian evolution via natural selection), it would seem that learning theories would have to function well outside of the changing cultural climate to provide a more lastingly relevant and comprehensive theory of learning. Mapping the neural pathways of the brain, for instance, may create a sort of “neo-cognitivism.” In time, other aspects of learning theories will also become more comprehensive. I believe the goal is to subsume the most accurate information from each of the existing learning theories into one comprehensive and testable theory. Until that time, learning theories are a great launching point and a solid anchor to propel and ground instruction. But none of the theories, even taken together, should be accepted as gospel truth.

 

References:

Conlan, J., Grabowski, S., & Smith, K. (2003). Adult learning. In M. Orey (Ed.), Emerging perspectives on learning, teaching, and technology. Retrieved from http://projects.coe.uga.edu/epltt/index.php?title=Adult_Learning

Davis, C., Edmunds, E., & Kelly-Bateman, V. (2008). Connectivism. In M. Orey (Ed.), Emerging perspectives on learning, teaching, and technology. Retrieved from http://projects.coe.uga.edu/epltt/index.php?title=Connectivism

Laureate Education, Inc. (n.d.). “Learning strategies and styles.” Retrieved from https://class.waldenu.edu/webapps/portal/frameset.jsp?tab_tab_group_id=_2_1&url=%2Fwebapps%2Fblackboard%2Fexecute%2Flauncher%3Ftype%3DCourse%26id%3D_4198570_1%26url%3D#global-nav-flyout.

Tuesday, April 8, 2014

Reflections on Time, a Life of Meaning, and Learning


Now more than ever, I’m feeling the enormous press of time, which marches in a singular direction. At odd moments, when we are restless or bored or waiting, we become aware of the invisible metronome that ticks away the beats of our unknown minutes, hours, days, years. I have been listening to it more than I think I ought to.

I think of life as similar to the experience of looking through one of those tourist binoculars you have to put a quarter into to view the sights around you (as seen on the cover of Bill Bryson’s book, I’m a Stranger Here Myself).

Binoculars used by tourists (minus the stars and smiley face), as seen on the cover of Bill Bryson’s I’m a Stranger Here Myself, a book I recommend for its levity as much as its insightfulness.

We are likely on top of a tower or mountain, or at the edge of a pier, but always on the edge of something, and we place our quarter into the slot. We enjoy the sights and look around, getting our fill of some mundane details writ large, but hopefully a long view of the best of what’s around. And then, unexpectedly, suddenly, the shutter falls, and everything goes black. You know it’s coming, but it always comes as a surprise, simply because no one counts the time when they are looking at something amazing, and most people, if not all, haven’t a clue what the timer is set to. In a world of infinite quarters, this might not be a big deal, but in an existence where there is only one quarter, one life to spend as best as one can, it makes all the difference. A careful observer might witness the tourist ahead of him jerk his head back in surprise as the shutter falls, and hear him mutter, “Well, I guess that was it…” So the second person in line holds his quarter and steps up to the lenses, determined to make the most of his time by immediately focusing on that which he is most eager to see, so that when his time is almost up he is satisfied that it was meaningful.

Even for those who believe there is “life after life,” nearly everyone agrees that this is very likely to be the only life we get here as “us,” as we currently are. Many still make the same assumption that Descartes made hundreds of years ago, that there is “mind” and there is “body,” and that the two are fundamentally separate from each other. This naturally leads to the assumption that the “mind” can exist apart from the brain or the body. Perhaps you believe we come back as a flower (a mindful one?), or a duck, or a harp player in the cosmos somewhere. I don’t think there is sufficient reason to believe any of that is true, and even if it were, our “quarter,” our life, is a specific kind of currency that can only be spent once while we are here. We may shudder at the thought, but the shutter closes upon us all, and that is that.

So, this again brings us back to the question of how to get the most meaning out of how we spend what we have. I brought up the mind and the brain just now because I believe it gets to the core of this question. We used to measure death by when the heart stops beating, and in many cases we still use this indicator, but some people survive on life support after their heart fails and are therefore not dead. Perhaps a better indicator is when we are completely brain dead, but there are cases when people declared “brain dead” are still alive, and even show some brain activity (https://www.youtube.com/watch?v=vEuh6tDidUw). However, is either of these states what we think of when we think of “living”? Most people would make a distinction between “existing,” or even being “conscious,” and “living” in the full sense of the word. I’ll get right to what I think makes the difference: we measure the quality of our lives, in large part, by what we see, experience, and learn. Unless we are physically blind, 80% of what we learn is through what we see, hence the binocular metaphor for life. Our experiences help us learn and see things differently, and this learning in turn adds greater depth to our future experiences. It is a continual cycle. What we learn and what we experience generate meaning in our lives, and that meaning is made greater by the people we share it with. I believe this is why I ultimately want to be, and continue to become, an instructional designer: learning, and teaching others through great experiences, is the most direct path we have to a meaningful life, and it is a kind of work that can outlast us.

And by this, we come to the only life I believe we will have after our life is done: we live on in the memories, learning, and experiences of others. What we do, what we see, and what knowledge we acquire have the ability to outlast us (see footnote 1). So much human activity, specifically in the information and Internet age, is dedicated to maintaining this store of experiences and information, and rightly so, as our progress of any kind is contingent upon it.

I began by talking about my hyperawareness of time as of late, and how that has made me think about how best I wish to use it. Socrates said that “the unexamined life is not worth living.” Perhaps it is also true that a life examined and scrutinized too closely is not worth living either, being too tedious and even painful. In order to function in life, we have to take some things for granted to an extent, such as our time. Perhaps there is a happy medium, where we are aware of our limited time enough to make the most of it, but not so hyperaware as to be paralyzed by the fear of having wasted it. I think that’s an idea Socrates and Aristotle would embrace, for what it’s worth. While I’m here, and while I have my quarter in the slot, my aim is to focus on living a life of learning and passing that on to others, as best I can. To me, this is the life that matters.

 

1. Footnote: Our learning and information can survive individual humans, but it will not survive humanity, so this is not a play at “immortality” by any stretch of the imagination, especially when you consider information’s “half-life,” and that most information in our own lifetimes will become lost or irrelevant. If humans were to die off, our book paper would last hundreds of years, but the words printed on it would be washed away or made illegible as the pages turn to carbon. Our batteries would corrode, the hard drives and routers would fail, and there would be no electricity to run it all anyway. Even if beings rivaling or surpassing our intelligence discovered from far away what was left behind, very little of our “living” as we know it would be recoverable, save a few buildings, empty shells of our former existence...