Temptation by Emma Patterson

Imagine sitting in a classroom and participating in a debate. Everyone has a pencil and paper out on their desk, and, while they scribble down notes every few minutes, their focus is primarily on actively participating in the discussion and following all of the lines of reasoning being drawn. Freeze that frame in your mind. Replace the pencils and paper with laptops and tablets; now, instead of hearing the voices of your peers command the room, hear the clicks of people typing on keyboards, paired with occasional murmurs of an idea or a teacher asking, “Does anyone have any thoughts at all?” The engaging environment has been eliminated, and in its wake is an awkwardly silent, disconnected group of people waiting out a forty-five-minute period.

I have sat in far too many classrooms in which there have been walls between my classmates and me created by technology.

In our freshman year, we were required to have iPads to do our work. Now we have graduated to laptops, but the result is still the same: no one is ever completely on task. Personal technology is a distraction in an otherwise stimulating environment that supports creativity and independence, and it keeps students from retaining lessons.

It seems dramatic, but it’s all very simple. When a student gets bored of the lesson or tired of paying close attention, they stop listening and start noticing the red circles appearing on their apps, notifying them that someone is trying to reach them. From there, they wonder who it is, what they need, and whether it is important; then it is almost impossible to keep from clicking on the messages app and starting a new, probably more interesting, conversation. Once they stop paying attention, they will not hear the rest of the teacher’s lesson. According to EdTech, a website dedicated to researching technology’s role in education, the technology available in classrooms stretches from interactive whiteboards to cell phones.

There is an important distinction to be made: educational technology is technology a teacher uses to enhance a lesson with some sort of media supplement, while personal technology is a device a student uses in the classroom, at the teacher’s direction, with the expectation that it will be used for educational purposes.

A classroom is a place designed to foster a student’s ability to problem-solve, develop their character, and ask questions, and that cannot happen authentically when they have the ability to Google answers that they could have gotten to on their own. Technology has a tendency to enhance laziness more than it does education.

The social aspect of school has been turned on its head since the introduction of social media, which has created a culture of constant negativity and cyberbullying. As I stated earlier, overcoming the instinct to click on a notification is nearly impossible, and a student who is experiencing bullying will never feel safe, as they are tortured every moment they have their devices on them. This creates hostility that stretches into every moment in a classroom; school can no longer be a haven in which students are given time to be distracted from negativity and pressure by being allowed to channel their energy into creativity, growth, and knowledge.

Finally, technology gives us the illusion of multitasking. Students claim they can watch TV, do math homework, and hold four conversations on three different platforms; in reality, this is not the case. Instead of learning to give their attention to one task at a time to ensure it is done to the best of their ability, students learn to prioritize efficiency over quality. They believe that they have no cracks in a perfect system, but really they are losing the ability to retain information and extend their attention span.

Technology addiction is classified as a legitimate addiction by several research institutions, including Stanford University. As discussed in the January 2012 CBS News article “Internet addiction changes brain similar to cocaine: study,” technology affects our emotions, decision-making, and self-control. People lose the ability to connect with their peers and loved ones. When that dynamic is brought into a classroom, students with still-developing brains do not have the ability to overpower their dependence on technology. Instead of wanting to connect with the diverse environment of a classroom, they want to check the likes on their latest Instagram post. They’d prefer looking at relatable memes to participating in a discussion on American foreign policy. As a student, I can say that, as soon as I get off track on my computer in a class, it is incredibly rare that I will get back on track before class finishes, and then a cycle begins: after a student stops paying attention, they fall behind, quickly get frustrated, and stop paying attention altogether, which only leads to more confusion. A question that arises is how much the student is to blame for their increasing isolation from their classmates. Personal technology is engineered to pull our attention and throw us into a black hole of measuring our worth by our number of followers, how similar we look to celebrities, and how every part of our body compares to an unachievable ideal. When we exist in that mindset, the research our teacher has asked us to do during our class period pales in comparison to checking all of our social media platforms. It isn’t the fault of the teacher for not being interesting enough; it isn’t the fault of the student for not having more self-control. Placing blame doesn’t remove the distraction; we can only achieve a positive environment when we eliminate the distraction from our daily educational experiences.
Is this barrier between teachers and students really the educational environment we all hope to experience?

It cannot be denied that technology is integral to the efficiency of workplaces around the world, but it can be said that it hinders creativity. With the resources that technology provides, it is perceived as lazy to sit around brainstorming when you could simply Google your problem and have the top one hundred tried-and-true solutions in seconds. Developing your own ideas and beliefs takes exposure to what is out there in our world, but it also takes reflection. There are many great thinkers whom we can aspire to emulate, but our value comes when we step out from their shadows as individuals. Individuality comes from within, not from DuckDuckGo. It is said that schools are where we, as unique people, are developed, but can that be said of a classroom full of students who are presenting whatever was written on Buzzfeed as their own opinions? Researching a topic online gives us facts that we corroborate with several reputable intellectual and news outlets, but students in our time confuse fact with opinion. The opinion comes after we learn the facts; it comes when we implement creativity. Creativity cannot be supplied by any website; we find it when we put our devices away and let our stream of consciousness flow. In classrooms that don’t allow personal technology, students are required to be alone with their thoughts, to hold several different views on the same matter in their minds, and to hone their own feelings on a topic.

Technology can also rewire our brains. Students shorten their attention spans by flipping through apps. For example, at this exact moment, I have five desktops, five apps, and eighteen tabs open, and that will probably not change drastically until the disorganization and clutter start to bother me in about a week. Does that make me a bad student? No, but it makes me a distracted one. While every person has the ability to focus on a task at hand, how long before your mind falls to your dashboard and another app begs for your focus? According to the UK news outlet The Telegraph, a study from May 2015 showed that the human attention span fell from twelve seconds in 2000 to eight seconds in 2015. According to that same article, a goldfish has an attention span of nine seconds. In a study called Attention Span Statistics done by Statisticbrain.com in 2016, an office worker checks their email 30 times per hour, and people get impatient with an Internet video after an average of 2.7 minutes. All of these times have dropped significantly with the introduction of personal technology into workplaces and schools. The moment people get bored, they switch tasks, which only worsens their attention span. On average, a person must write down a simple fact seven times before they remember it in the short term, but, in order to retain information, we must study in short intervals over an extended period of time, and, with our attention on short-term efficiency, the likelihood of repetition and working ahead is slim to none.

It would be absurd to say that technology is an exclusively negative influence. In today’s workplace, proficiency in technology is not an added bonus for a job candidate; it is a requirement. However, banning personal technology from classrooms is not the same as banning it from education as a whole. Homework can still involve technology; the only process a ban affects is class interaction and connection. It is also important to distinguish between teaching about technology and letting students use personal technology independently in the classroom; again, teaching about technology is crucial for a student’s success in the workforce, but personal use of technology holds back a student’s creative development. There is something to be said for the dynamic an engaged classroom gains when a student is able to bring up a relevant article or material that sparks a discussion or debate. Those moments are important to learning, as the student has shown interest in a way that pulls them deeper into the material, and I must concede that, in those instances, technology is an enhancer in a classroom. But such moments are rare enough that the question of their worth is raised once again.

Personal technology is a distraction in an otherwise stimulating environment that supports creativity and independence, and it keeps students from retaining lessons. To call technology evil or wrong is not the correct approach; it is our use of technology that must be evaluated. We have lost quiet moments of reflection, in which we could foster creativity and our passions, to glowing screens and comparison to unattainable goals. Our young lives should emphasize education and exploration, not of the limitless reaches of the internet, but of the depths of our minds and characters. The classroom is where so much of that is developed, so I call our reliance on technology into question.

Holiday Film Recommendations by Kira Cruz

Snuggle up by the fire with a cup of Williams-Sonoma Classic hot cocoa and enjoy the benefits of the holiday movie season. Join Hallmark, Lifetime, Netflix, and Freeform (the new ABC Family) for infinite movie marathons and endless moments of wishing we had grown up in a small town, met a young fella, and experienced love at first sight.

Christmas movies are a great way to wind down from the stress of finals and find refuge. Here are some of my favorites/must-sees for this holiday season:

Coming in as #1: The Holiday (2006)

This is a classic. You are not doing the holiday season right without watching this love story with an A-List cast who are not too hard on the eyes!

Could life get any better for Cameron Diaz and Kate Winslet? Seeking refuge and solitude, Cameron Diaz travels to Surrey, England to escape her never-ending work life and, in particular, men! However, Diaz gets lucky and finds a well-groomed, handsome Englishman, Jude Law, and of course they fall madly in love. This relationship between Law and Diaz could not be any more perfect, and it makes you wonder, “Where is my Jude Law?”

On the other side of the Atlantic, we have a love story between Kate Winslet and Jack Black. Though at first they are just friends, their friendship soon sparks into something much more. Jack Black’s character is particularly appealing to the female audience because of his kindness and utterly hilarious sense of humor, which we all yearn to have. The Holiday is a must-see for anybody looking for some romance in their lives without actually going out and finding “Mr. Right,” instead finding him right at home on the chaise lounge while eating sugar cookies.

Coming in close behind at #2: Elf (2003)

It reminds people that Christmas is not about all the boxes under the tree; rather, it is about bringing others together by spreading joy and happiness.

An oldie but a goodie. As Buddy the Elf, Will Ferrell, a Saturday Night Live alum who never fails to make us laugh, makes the holidays even better when his Christmas cheer hits our screens. If you have not seen Elf yet, I suggest going to amazon.com and buying it, because it is a must-see! Buddy helps even those who recoil from the holidays by reminding us that “the best way to spread Christmas cheer is singing loud for all to hear.” His genuine excitement for Christmas gets everyone in the spirit to trim the tree, build a snowman, and bake an unhealthy number of cookies. I daresay this feel-good movie should even be played in class during the holiday season, because it reminds people that Christmas is not about all the boxes under the tree; rather, it is about bringing others together by spreading joy and happiness.

Squeaking in as my #3 favorite Christmas Movie: Unaccompanied Minors (2006)

I watch this movie every Christmas Eve because it reminds me how lucky I am to be with my loved ones instead of stuck at an airport without family on Christmas.

When I first watched Unaccompanied Minors, I wanted to travel alone and wander the airport by myself to be independent and have some freedom. However, traveling is sometimes not very exciting, and it is especially grueling when you lose your luggage and are without your family. This movie reminds me to be grateful that I’m with my mom on the biggest holiday of the year instead of sleeping in an uncomfortable airport chair next to random strangers. Nevertheless, the kids in the film make the best of it by exploring the behind-the-scenes rooms of the airport and by making lifelong friends in the process of running from airport security. The ability of these kids to make the best out of their situation is very admirable, and it teaches me to look for creative solutions during times of bleakness.

Overall, this movie is one of my favorites because the storyline is very easy to relate to, in part because all of the characters are teenagers (or nearly so), and in part because it teaches lessons about teamwork, family, and how to spread the Christmas spirit.

Best Christmas Scene: The Yule Ball from Harry Potter and the Goblet of Fire (2005)

Even though this movie was filmed many moons ago, the special effects and sets are jaw-dropping, and in general, it is a timeless flick.

Have you ever seen anything more beautiful? That was my exact reaction when I first saw this movie at age seven. As a youngster, I used to think prom would look like this, but sadly, the Hogwarts Yule Ball and Catalina Prom are not in the same lane. It makes me wish I wasn’t a muggle and that I had attended Hogwarts; life is simply not fair sometimes.

This scene is the epitome of winter wonderland. It has ice sculptures, multiple Christmas trees, formal attire, romance, MEN, and of course, magic. Moreover, the beauty of Hermione as she walks down the three flights of stairs to the arms of Viktor Krum can be nothing other than GOALS. She looks so effortlessly beautiful and walks with ease even with everyone’s eyes on her. In that moment, Hermione reminds every little girl that smart girls can get the guy and have the brains, too.

Hermione reminds every little girl that smart girls can get the guy and have the brains, too.

Harry Potter and the Goblet of Fire is a great movie to watch with friends on a rainy day in Monterey in December when you are in need of a little fantasy and mystery in your life. However, if you do not want to see the death of SPOILER ALERT Cedric Diggory, played by Robert Pattinson (from Twilight), then I suggest you grab some blankets and watch Elf because there is nothing sad about Elf.

PODCASTS

Kickin’ It With Kaylaa

Kaylaa Kawasaki ’17 is a tennis player with a strong passion for all things sports. In her podcast, Kaylaa interviews Athletic Director Paul Elliott, student athletes, former athletes, and faculty. She focuses on their perspectives on the Santa Catalina athletics program and gives updates on how the teams are doing.

The Greater Good by Loleï Brenot


Since the September 11, 2001 terrorist attacks, the United States has waged a “War on Terror.” While this war on terror has led to positive outcomes, such as the death of Osama bin Laden and the creation of the Department of Homeland Security, other actions have led to questioning of the government’s and the president’s true constitutional powers. The actions taken by the Bush administration in response to these attacks, chiefly the creation of the Uniting and Strengthening America by Providing Appropriate Tools Required to Intercept and Obstruct Terrorism (USA PATRIOT) Act, led to a questioning of the legality of mass public surveillance. Although these governmental security measures infringe upon one’s individual and constitutional rights, one must consider the grand scheme of things and the future generations of our world. The Western way of life and the growth of terrorism at an unprecedented scale are not compatible with a successful future world. Thus, the United States government and the National Security Agency do have the authority to surveil the residents of the United States of America, as this is a protective and offensive action taken to defend the United States and its citizens from the ever-growing and prevalent threats of both domestic and international terrorism.

Since knowledge of the National Security Agency’s surveillance came to light, first in 2005 through the New York Times and then again in 2013 through Edward Snowden, there has been a national outbreak of outrage directed toward the PATRIOT Act. Many Americans have spoken out about its unconstitutionality as well as the severity of the government’s infringement upon individual rights. The Bush administration was immediately attacked, as President Bush had seemingly bypassed the checks and balances of the American system and executively ordered the unconstitutional surveillance of the American people. As reported by Risen and Lichtblau in the 2005 New York Times article “Bush Lets U.S. Spy on Callers Without Courts,” government officials said, “Under a presidential order signed in 2002, the intelligence agency has monitored the international telephone calls and international e-mail messages of hundreds, perhaps thousands, of people inside the United States without warrants over the past three years in an effort to track possible ‘dirty numbers’ linked to Al Qaeda.” The NSA has since released information about how communications are used. The NSA infographic on the right shows how one’s communications are run through the government’s surveillance program. This, along with outside information about how one’s data is processed, shows that the NSA internet surveillance program, PRISM, processes the information and only sends it on to a round of deeper examination if something troubling is flagged. Despite this, apprehension and distrust still remain, mostly due to the government’s lack of clarity on the issue at the beginning of the program’s creation and use. Furthermore, this secretive spying on Americans has faced strong opposition from both major political parties.
Even with past measures put into place to prevent such infringement upon one’s rights, chiefly the National Security Act of 1947, which specifically prohibits domestic intelligence operations, actions contravening these laws were nonetheless taken. However, through this surveillance, dozens of terrorist attacks, both international and domestic, have been thwarted, as NSA Director General Keith B. Alexander testified at the June 18, 2013 House Intelligence Committee hearing.

While many believe that the government has taken unconstitutional actions, many instead believe that the world has come to a place where some sacrifices must be made for the greater good. The “country before self” argument holds that citizens must be willing to part with rights that their predecessors fought for long before them in order to ensure the safety and security of their country.

As President Kennedy stated in his inaugural address, “Ask not what your country can do for you; ask what you can do for your country.” This concept of country first is a value instilled in the spirit of Americans of every color and creed, and thus these government actions were taken in order to ensure the safety of the United States and its citizens.

In the United States particularly, with the mantra and mentality of “country before self” so strong and prevalent throughout much of the population, surveillance is seen as a small sacrifice to ensure the safety of the American dream.

Individual rights are among the most important rights for American citizens, as the individual rights of man are what spurred the birth of the United States in the first place, over 200 years ago. Apart from infringement upon individual rights, other damages have been caused by the surveillance, such as economic loss due to tax dollars being funneled into the NSA and a lack of trust in the US government. While these are further costs of surveillance, they are minimal in the long run if they lead to a safer world. Furthermore, often cited in the argument against surveillance is Benjamin Franklin’s statement, “Those who give up essential liberty to purchase a little temporary safety deserve neither liberty nor safety.” While this quote can be effective in touting the belief that the Founding Fathers would have strenuously disagreed with this surveillance, Franklin does specify “temporary safety.” In the case of terrorism, however, the world is not fighting for temporary safety but rather to ensure the blessed continuation of life as it is currently known for generations in the future.

While mass surveillance raises the issue of the ever-growing power of the federal government, if people are truly concerned with keeping with the Founding Fathers’ ideals of a balanced, centralized government, the president and their power must be kept in check. In this day, civil rights must be more clearly defined, as the world has changed so rapidly and drastically in the 200 years since our country’s birth. The Founding Fathers could never have thought to take something such as mass surveillance into account, and thus, suitable government actions must be more clearly outlined for the world to see. As Alexander Hamilton specified in Federalist No. 23, “it is impossible to foresee or define the extent and variety of national exigencies, or the correspondent extent and variety of the means which may be necessary to satisfy them,” so arguments can be made that, in the case of the War on Terror, the actions taken by the Bush administration were within reason of America’s founding ideals.
In the great words of President Reagan, “there should be no place on earth where terrorists can rest and train and perfect their deadly skills. I meant it. I said that we would act with others if possible to ensure that terrorists have no sanctuary anywhere.” The deadly poison that is terrorism has not yet been stamped out, but the American government has a duty to continue all efforts to do so. Over the past eight years, with an overly politically correct and evasive approach to this prevalent issue, the American public has lacked a strong leader willing to stand up and condemn these horrifying actions. Thus, the lack of faith in the War on Terror that crept in at the end of President George W. Bush’s term has only continued to grow. Ultimately, until the threat of terrorism is under control, faith and trust must be placed in the hands of the government.

The threat of domestic and international terrorism has driven the United States government to take drastic actions in the war on terror, but those drastic actions have been taken with cause.

The fight for the survival of the American ideal and dream in the long term is one that must be upheld, even if it means compromising certain values in the short term.

New Frontiers in Tech by Katie Gorton

“Above all, we must embrace that quintes­sentially American compulsion to race for new frontiers and push the bound­aries of what’s possible. If we do, I’m hopeful that tomorrow’s Americans will be able to look back at what we did—the diseases we conquered, the social problems we solved, the planet we protected for them—and when they see all that, they’ll plainly see that theirs is the best time to be alive. And then they’ll take a page from our book and write the next great chapter in our American story, emboldened to keep going where no one has gone before.”

President Obama

Recently, President Barack Obama took the time to be the guest editor of the November “Frontiers” issue of Wired magazine. President Obama is the first sitting president ever to guest edit a magazine. The main focus of the November issue is the future. Mr. Obama covers issues ranging from landing men and women on Mars to precision medicine and figuring out how the human genome can unlock some of the world’s deadliest diseases.

In the magazine, he discusses his optimism for the future due to the constant churning of scientific progress. He shares stories and ideas about what lies beyond the barriers we haven’t broken yet. All of these innovations, he claims, will make the world better for the planet, for individuals, and for communities, by discovering solutions to climate change and bringing new technology into medicine.

In a new video series on Wired’s YouTube channel, President Obama discusses the exciting future of artificial intelligence. To explain, artificial intelligence is intelligence exhibited by machines. AI would allow cognitive functions that humans associate with human minds, such as reasoning and problem-solving, to be exhibited by computers. This process is not going to happen overnight, but it is predicted to occur within the next twenty or thirty years.

He states that AI will make the world safer by eliminating human error (i.e., self-driving cars) and stimulating the economy, but recognizes that AI also could have downsides: eliminating jobs, increasing inequality, and suppressing wages. There are even concerns that AI could surpass our ability to understand it and that machines will end up doing everything without humans, thereby decreasing jobs, but it is important to remember that these scenarios are all hypothetical. Many confuse them with the science fiction that surrounds the idea. It seems, however, that there is more good than bad to come from it. Mr. Obama compares AI to the calculator: an extension of our intelligence and a way to create good rather than cause harm.

AI is currently being developed to diagnose diseases and develop treatments for them, as well as to power self-driving cars that will be much safer than those with human drivers. There are myriad other areas of tech the President touches on in the November issue. The new “Frontiers” edition of Wired magazine was released on Oct. 25th, and I hope all of you will enjoy it!