Temptation by Emma Patterson

Imagine sitting in a classroom and participating in a debate. Everyone has a pencil and paper out on their desk, and, while they scribble down notes every few minutes, their focus is primarily on actively participating in the discussion and following all of the lines of reasoning being drawn. Freeze that frame in your mind. Now replace the pencils and paper with laptops and tablets; instead of hearing the voices of your peers command the room, hear the clicks of keyboards paired with occasional murmurs of an idea or a teacher asking, “Does anyone have any thoughts at all?” The engaging environment has been eliminated, and in its wake is an awkwardly silent, disconnected group of people waiting out a forty-five-minute period.

I have sat in far too many classrooms in which there have been walls between my classmates and me created by technology.

In our freshman year, we were required to have iPads to do our work. Now we have graduated to laptops, but the result is still the same: no one is ever completely on task. Personal technology is a distraction in an otherwise stimulating environment that supports creativity and independence, and it keeps students from retaining their lessons.

It seems dramatic, but it’s all very simple. When students get bored of the lesson or tired of paying close attention, they stop listening and start noticing the red circles appearing on their apps, notifying them that someone else is trying to talk to them. From there, they ask themselves who it is, what that person needs, and whether or not it is important, and then it is almost impossible to keep themselves from clicking on their messaging app and starting up a new, probably more interesting, conversation. Once they stop paying attention, they will not hear the rest of the teacher’s lesson. According to EdTech, a website dedicated to researching technology’s role in education, the technology available in classrooms stretches from interactive whiteboards to cell phones.

There is an important distinction to be made: educational technology is technology used by a teacher to enhance a lesson with some sort of media supplement, while personal technology is a device a student brings into the classroom with the expectation, at the teacher’s direction, that it will be used for educational purposes.

A classroom is a place designed to foster a student’s ability to solve problems, develop character, and ask questions, and that cannot happen authentically when students have the ability to Google answers that they could have reached on their own. Technology has a tendency to enhance laziness more than it does education.

The social aspect of school has also been turned on its head since the introduction of social media, which has created a culture of constant negativity and cyberbullying. As I stated earlier, overcoming the instinct to click on a notification is nearly impossible, and a student who is experiencing bullying will never feel safe, as they are tormented every moment they have their devices on them. This creates hostility that stretches into every moment in a classroom; school can no longer be a haven in which students are given time away from negativity and pressure and allowed to channel their energy into creativity, growth, and knowledge.

Finally, technology gives us the illusion of multitasking. Students claim they can watch TV, do math homework, and hold four conversations on three different platforms; in reality, this is not the case. Instead of learning to give their attention to one task at a time to ensure it is done to the best of their ability, students learn to prioritize efficiency over quality. They believe there are no cracks in a perfect system, but really they are losing the ability to retain information and extend their attention span.

Technology addictions are classified as legitimate addictions by several research institutions, including Stanford University. As discussed in the CBS News article “Internet addiction changes brain similar to cocaine: study,” published in January 2012, technology affects our emotions, decision-making, and self-control. People lose the ability to connect with their peers and loved ones. When that dynamic is brought into a classroom, students with still-developing brains do not have the ability to overpower their dependence on technology. Instead of wanting to connect with the diverse environment of a classroom, they want to check the likes on their latest Instagram post. They’d prefer looking at relatable memes to participating in a discussion on American foreign policy. As a student, I can say that, as soon as I get off track on my computer in a class, it is incredibly rare that I will get back on track before class finishes, and then a cycle begins: after a student stops paying attention, they fall behind, get frustrated, and disengage even further, which only leads to more confusion. A question that arises is how much the student is to blame for their increasing isolation from their classmates. Personal technology is engineered to pull our attention and throw us into a black hole of measuring our worth by our number of followers, how similar we look to celebrities, and how every part of our body compares to an unachievable ideal. When we exist in that mindset, the research our teacher has asked us to do during our class period pales in comparison to checking all of our social media platforms. It isn’t the fault of the teacher for not being interesting enough; it isn’t the fault of the student for not having more self-control. Placing blame doesn’t remove the distraction; we can only achieve a positive environment when we eliminate the distraction from our daily educational experiences.
Is this barrier between teachers and students really the educational environment we all hope to experience?

It cannot be denied that technology is integral to the efficiency of workplaces around the world, but it can also be said that it hinders creativity. With the resources technology provides, it is perceived as lazy to sit around brainstorming when you could simply Google your problem and have the top one hundred tried-and-true solutions in seconds. Developing your own ideas and beliefs takes exposure to what is out there in our world, but it also takes reflection. There are many great thinkers we can aspire to emulate, but our value comes when we step out from their shadows as individuals. Individuality comes from within, not from DuckDuckGo. It is said that schools are where we, as unique people, are developed, but can that be said when a classroom is full of students who are just presenting whatever was written on Buzzfeed as their own opinion? Researching a topic online gives us facts that we corroborate with several reputable intellectual and news outlets, but students in our time confuse fact with opinion. The opinion comes after we learn the facts; it comes when we implement creativity. Creativity cannot be supplied by any website; we find it when we put our devices away and let our stream of consciousness flow. In classrooms that don’t allow personal technology, students are required to be alone with their thoughts, to hold several different views on the same matter in their minds, and to hone their own feelings on a topic.

Technology can also have the effect of rewiring our brains. Students shorten their attention spans by flipping through apps. For example, at this exact moment, I have five desktops, five apps, and eighteen tabs open, and that will probably not change drastically until the disorganization and clutter start to bother me in about a week. Does that make me a bad student? No, but it makes me a distracted one. While every person has the ability to focus on the task at hand, how long before your mind falls to your dashboard and another app begs for your focus? According to the UK news outlet The Telegraph, a study from May 2015 showed the human attention span falling from twelve seconds in 2000 to eight seconds in 2015. According to that same article, a goldfish has an attention span of nine seconds. In a study called Attention Span Statistics done by Statisticbrain.com in 2016, an office worker checks their email 30 times per hour, and people get impatient with an Internet video after an average of 2.7 minutes. All of these times have dropped significantly with the introduction of personal technology into workplaces and schools. The moment people get bored, they switch tasks, which only worsens their attention spans. On average, a person must write down a simple fact seven times before they remember it in the short term; but in order to retain information, we must study in short intervals over an extended period of time, and, with our attention fixed on short-term efficiency, the likelihood of repetition and working ahead is slim to none.

It would be absurd to say that technology is an exclusively negative influence. In today’s workplace, proficiency in technology is not an added bonus for a job candidate; it is a requirement. However, banning personal technology from classrooms is not the same as banning it from education as a whole. Homework can still involve technology; the only process a ban changes is class interaction and connection. It is also important to distinguish between teaching about technology and letting students use personal technology independently in the classroom; again, teaching about technology is crucial for a student’s success in the workforce, but personal use of technology holds back a student’s creative development. There is something to be said for the dynamic an engaged classroom gains when a student is able to bring up a relevant article or material that sparks a discussion or debate. Those moments are important to learning, as the student has shown interest in a way that pulls them deeper into the material, and to that I must concede that, in those instances, technology is an enhancer in the classroom. But such moments are rare enough that the question of their worth is raised once again.

Personal technology is a distraction in an otherwise stimulating environment that supports creativity and independence, and it keeps students from retaining their lessons. To call technology evil or wrong is not the correct approach; it is our use of technology that must be evaluated. We have lost quiet moments of reflection, in which we could foster creativity and our passions, to glowing screens and comparison to unattainable ideals. Our young lives should emphasize education and exploration, not of the limitless reaches of the internet, but of the depths of our minds and characters. The classroom is where so much of that is developed, so I call our reliance on technology into question.

Women’s Sports Broadcasting: Four Decades Behind by Audrey Bennett

The number of women athletes has steadily increased in the years since the passing of Title IX in 1972. However, TV and media coverage of women’s athletics has failed to keep up. Shockingly, a 20-year-long USC study of ESPN and other Los Angeles sports coverage outlets revealed that only 3.2% of airtime is dedicated to women’s sports, which is less than the reported 5% in 1989. Readers of prominent newspapers encounter a similarly dismal ratio of women’s to men’s athletics coverage. While there is plenty of space for a thorough story on the various jersey numbers possible for an NFL player (an article featured in an October 2016 New York Times paper), coverage of the WNBA finals is virtually nonexistent. This sort of unequal coverage perpetuates stereotypes and conformity to gender roles while marginalizing the dynamic social changes that have occurred over the last 25 years. Significantly increasing attention to women’s sports through broadcasting, newspapers, and magazines would positively affect the way women feel and the way women are treated in society.

Only 3.2% of airtime is dedicated to women’s sports, which is less than the reported 5% in 1989

Media and news coverage, often referred to as the “fourth branch of government,” holds a unique position of power in the US. With this power comes great responsibility. These news outlets, whether it be Sports Illustrated, ESPN, or even the local paper covering high school sports stories, have a moral obligation to strive for fair and equal coverage. USC sociologist Mike Messner notes that “news programs are supposed to be a window to the world and there is a journalistic responsibility to reflect that.” However, these news outlets have repeatedly failed to accurately represent the true demographic of athletes and fans. A 2014 USC study led by Mike Messner and Cheryl Cooky examined three Los Angeles network affiliates and found that they collectively ran 60 stories on the March 2009 men’s NCAA basketball tournament. Zero stories were featured on the women’s NCAA tournament of that year.

There seems to be hardly enough time for depth and breadth of coverage for women.

Completely ignoring the parallel women’s tournament demonstrates the preferential treatment that men’s sports receive on a daily basis. Although there are more ticket sales for the men’s tournament, a 60:0 ratio disregards the thousands of fans of the women’s teams. Also, it is not as if crucial topics of men’s sports dominate every moment of airtime. While there is plenty of time for marginal stories about where former Lakers player Kendall Marshall will find a good burrito in Milwaukee (a story featured in a July 2014 release from USA Today), there seems to be hardly enough time for depth and breadth of coverage for women. However, if we generate sports coverage of equal quality and quantity between the sexes, young girls will grow up with more visible female athlete role models, and both boys and girls will see that athletic pursuits are not merely for males.

What’s more, during that small percentage of airtime devoted to women’s sports, the quality of coverage is not even equal. Researchers in Messner’s USC study noted that broadcasters relate the news of women’s sports in a more stoic and humorless manner, which suggests that viewers and broadcasters alike must brace themselves and endure a short segment on women before returning to the joyful and joke-filled coverage of men’s sports. As children and adults listen to and read this type of reporting, they come to unconsciously accept it as truth. Diversifying coverage would benefit the fight for equal treatment of both sexes. The current state of broadcasting perpetuates long-held prejudices that women and women’s athletics are somehow inferior and unworthy of our attention.

By consciously using about 97% of airtime to cover men’s athletics, broadcasting stations seem to think that men’s sports are the only type of sports that will attract viewers. Many news outlets would defend their decisions as purely economically based, perhaps only reflecting ticket sales of WNBA versus NBA games to attract the largest demographic of viewers. However, it is circular reasoning for broadcasting outlets and newspapers to blame popularity for their biased coverage. Sports teams gain popularity through media coverage, yet broadcasting groups repeatedly refuse to fairly feature women’s sports teams because they are not popular enough. How will women’s sports gain popularity comparable to men’s if they only get minimal coverage? Also, studies demonstrate that the interest and participation in women’s athletics is quite respectable. In fact, for the women’s 2015 soccer World Cup, 3.311 million viewers tuned in, making it the most-watched soccer match on Fox Sports 1. While some stations claim that their job is to reflect the interest of the current audience rather than generate new audiences, Cheryl Cooky, associate professor of gender, sexuality, and women’s studies at Purdue University, points out that “that is in one sense a false logic because the interest is there … [and] that particular logic lets sports media off the hook. Displacement of blame onto the audience or consumer removes any sort of accountability on their part…” Thus, if sports channels and newspapers were truly trying to reflect their interest groups, the coverage of women’s sports would be much greater.

Since the passage of Title IX in 1972, which prohibited sex-based discrimination in federally funded activities, the number of women in high school, collegiate, and professional-level sports has skyrocketed. The Women’s Sports Foundation reminds us that in 1971, the number of high school girls involved in interscholastic sports was about 294,000. Today it is closer to 3.1 million, which is significantly closer to the 4.4 million boys who play such sports. According to Running USA, there are now more women runners than men (10.7 million women participating in running races compared to 8 million men).

Just as African Americans had to fight for equal treatment and respect even after the necessary legislation was passed, the battle is still being fought for female athletes.

Some would argue that at this steep rate, women’s sports do not need additional support or coverage because they seem to be thriving under these circumstances. However, while the participation in women’s athletics has increased rapidly, the respect and treatment it receives has failed to catch up. The predominantly male sports broadcasters (only about 5% of sports anchors are women according to the 2014 USC study) still believe that this boom in interest in women’s athletics is not worthy of airtime. Women’s sports are still decades behind in the treatment and pay they receive. Just as African Americans had to fight for equal treatment and respect even after the necessary legislation was passed, the battle is still being fought for female athletes.

This battle transcends the world of sports broadcasting.

Media plays a key role in this battle. A different approach to women’s sports coverage could begin to shift the expectations, gender roles, and persistent sexism that women athletes around the world face every day. However, this battle transcends the world of sports broadcasting. The simple demand for equal treatment and respect for women still faces resistance in the workplace, in politics, and in the home. If men’s and women’s sports eventually receive equal coverage, perhaps it will influence our perceptions of gender and its role in determining one’s worth in society.

Give It a Shot by Sarah Lamp

Children are inherently vulnerable to a wide variety of dangers. No one knows this better than a parent, and certainly, almost any parent would agree with the statement that they would do anything to protect their child. However, in some cases, people may disagree about whether something is helpful or in fact harmful; one such case is vaccines. Although a majority of parents happily vaccinate their children and themselves, others refuse vaccines for their children on the grounds that vaccinations supposedly harm mental abilities. Unfortunately, by doing this they are in fact making their children, and the children around them, vulnerable to many terrible diseases. There is no irrefutable scientific evidence linking vaccines to an increased risk of disease or disabilities in children, meaning that parents who do not vaccinate their children gain nothing but risk a great deal. Perhaps the most significant reason that many distrust vaccines is chronic misinformation, as incorrect findings are distributed as facts and convince parents that vaccines are dangerous, thereby leaving children more exposed and vulnerable to disease than they otherwise would be. Parents are supposed to protect their children: vaccines are just one more way to do this.

For a parent, this is not just about facts and figures: it’s about the life of their child.

While these parents may think that they are doing what is best for their kids, proof that vaccines cause conditions such as autism spectrum disorder (ASD) is oftentimes thin and based on circular reasoning or manipulated data. However, many parents are not given the breadth of information necessary to make an informed decision. On the website of the non-profit advocacy organization Voices for Vaccines, one mother, Chrissy, reveals her experiences with both the pro- and anti-vaccine movements, making a critical point regarding how medical professionals can spur parents into a panic over vaccines by misdiagnosing young children with developmental delays as autistic. This can send already-anxious parents into a frenzy. In her article, Chrissy writes, “At first I was relieved because my worries had finally been validated. Then I was angry and convinced that my child had been damaged by the vaccines he had gotten,” clearly expressing the emotional turmoil that surrounds the vaccine debate because, for a parent, this is not just about facts and figures: it’s about the life of their child. Vaccines provide an opportunity to save lives and stave off disease – imagine what a difference a vaccine would have made during the Spanish Influenza epidemic a hundred years ago, how many lives could have been saved. When there is an opportunity to have a higher chance of preventing a disease and keeping their children healthy, parents have an unspoken obligation to seize it, or at the very least to make sure they understand what it entails.

Still, the facts and figures are important. Statistics alone prove that it is unspeakably foolish to leave children unprotected from horrific diseases: all one needs to do is look back at the time before vaccines, when even the President of the United States was not safe from polio. According to an article by Dina Fine Maron for the Scientific American in 2015, the United States Center for Disease Control (CDC) estimates that “among children born in the past two decades vaccinations will prevent more than 20 million hospitalizations and 732,000 deaths,” figures which highlight the dramatic difference vaccines can make.

To not vaccinate is a very selfish decision, as it affects not only one’s own child but also other children around them: to have one child unvaccinated is to potentially expose an entire school to a disease. Vaccines are developed for the very purpose of protecting people, and are designed to be safe for children; exhaustive clinical trials and tests are required by the CDC for the very purpose of ensuring that the drugs are as safe as possible. While it is true that genetic variation and immune deficiencies can sometimes result in a bad reaction, the chance of a child being diagnosed with a disease or disability as a direct result of a preventative vaccine is far lower than the risks of infectious disease an unvaccinated child faces. It is unfortunate that facts such as these are often misinterpreted or excluded from anti-vaccine forums, as positive, correct information plays an important role in lessening the stigma around vaccines and convincing parents.

Furthermore, no government organization nor reputable scientific community has, as of yet, put forth any proof that vaccines, or any ingredients in vaccines (specifically the mercury-based preservative thimerosal), cause autism; in fact, the CDC soundly refutes any claims to that effect, asserting on their website that “since 2003, there have been nine CDC-funded or conducted studies that have found no link between thimerosal-containing vaccines and ASD, as well as no link between the measles, mumps, and rubella (MMR) vaccine and ASD in children.” Despite this, in a 2011 article titled “Straight Talk about Vaccination,” Matthew F. Daley and Jason M. Glanz share the troubling results of a survey of over 1,500 parents, in which “one quarter […] [believed] that vaccines can cause autism in healthy children, and more than one in 10 had refused at least one recommended vaccine.” The reason for this level of ignorance is that parents have been consistently offered false information via the internet or personal anti-vaccination campaigns, or have not been corrected by medical practitioners. It is for this reason that a mandatory forum is needed to educate all new parents about vaccines, such as a required, government-sponsored information session about vaccinations that can address any and all fears. The problem is never that parents do not want to help their children: it’s that they are no longer sure what is best.

A mandatory forum is needed to educate all new parents about vaccines.

Yet, in the face of what is oftentimes irrefutable evidence, some anti-vaccine advocates persist in denouncing vaccines not just as causes of disabilities but also as a way for large pharmaceutical companies to exploit parents. According to this sector of the population, Image B, which was posted to an online forum titled “Diabolical Pro-Vaccination Campaign”, is just another example of coercive techniques designed to trick parents into poisoning their children. This reasoning, already weak, pales considerably when Dina Fine Maron reminds the world again in the Scientific American that the physician who was initially responsible for spreading the idea that vaccines are linked to autism was “barred from practicing medicine due to ethical lapses,” something which is often glossed over by devotees of the anti-vaccination movement, along with the fact that “more than a dozen studies [by expert organizations such as the American Academy of Pediatrics (AAP) and the Institute of Medicine (IOM)] have added to the body of evidence that this link does not exist.” Until there is definite proof showing the connection between vaccines and autism or other diseases, the idea that vaccines are always harmful is based on nothing more than pseudo-science and speculation.

Until such time as there is quantifiable evidence clearly showing that vaccines do cause more harm than good, parents have a duty to vaccinate their children.

The claims of the anti-vaccine movement have no scientific validity. Nonetheless, there is still the potential for real tragedy as a result of simple ignorance and the culture of misinformation that has grown around vaccines. Parents are trying to do the best they can for their children, but without having all the correct facts, it is difficult to make an informed decision. It is a situation that is unfair to everyone, but most of all to the children themselves, both those who are vaccinated and yet are at risk from the unvaccinated, and the unvaccinated themselves, who are exposed to potentially life-threatening diseases. To help combat this, the government and healthcare practitioners should step up efforts to reach and educate people, especially parents, who are wary about vaccines, as vaccinations are currently the best way to combat diseases and help improve general health. Ultimately, until such time as there is quantifiable evidence clearly showing that vaccines do cause more harm than good, parents have a duty to vaccinate their children. If they are truly not swayed by statistics alone, perhaps they should consider whether it would be better to have a living child with autism as a result of a vaccine, or a child who has died as the result of a disease for which they were not vaccinated.

Holiday Film Recommendations by Kira Cruz

Snuggle up by the fire with a cup of William-Sonoma Classic hot cocoa and enjoy the benefits of the holiday movie season.

Snuggle up by the fire with a cup of Williams-Sonoma Classic hot cocoa and enjoy the benefits of the holiday movie season. Join Hallmark, Lifetime, Netflix, and Freeform (the new ABC Family) for infinite movie marathons and endless moments of wishing we had grown up in a small town, met a young fella, and experienced love at first sight.

Christmas movies are a great way to wind down from the stress of finals and find refuge. Here are some of my favorites/must-sees for this holiday season:

Coming in as #1: The Holiday (2006)

This is a classic. You are not doing the holiday season right without watching this love story with an A-list cast who are not too hard on the eyes!

Could life get any better for Cameron Diaz and Kate Winslet? Seeking refuge and solitude, Cameron Diaz travels to Surrey, England, to escape her never-ending work life and, in particular, men! However, Diaz gets lucky and finds a well-groomed, handsome Englishman, Jude Law, and of course they fall madly in love. This relationship between Law and Diaz could not be any more perfect, and it makes you wonder, “Where is my Jude Law?”

On the other side of the Atlantic, we have a love story between Kate Winslet and Jack Black. Though at first they are just friends, their friendship soon sparks into something much more. Jack Black’s character is particularly appealing to the female audience because of his kindness and utterly hilarious sense of humor, which we all yearn to have. The Holiday is a must-see for anybody looking for some romance in their lives without actually going out and finding “Mr. Right,” but rather finding him right at home on your chaise lounge while eating sugar cookies.

Coming in close behind at #2: Elf (2003)

It reminds people that Christmas is not about all the boxes under the tree; rather, it is about bringing others together by spreading joy and happiness.

An oldie but a goodie. As Buddy the Elf, Will Ferrell, a Saturday Night Live alum who never fails to make us laugh, makes the holidays even better when his holiday cheer reaches our screens. If you have not seen Elf yet, I suggest going to amazon.com and buying it, because it is a must-see! Buddy helps even those who recoil from the holidays by reminding us that “the best way to spread Christmas cheer is singing loud for all to hear.” His genuine excitement for Christmas gets everyone in the spirit to trim the tree, build a snowman, and bake an unhealthy number of cookies. I daresay this feel-good movie should even be played in class during the holiday season, because it reminds people that Christmas is not about all the boxes under the tree; rather, it is about bringing others together by spreading joy and happiness.

Squeaking in as my #3 favorite Christmas Movie: Unaccompanied Minors (2006)

I watch this movie every Christmas Eve because it reminds me how lucky I am to be with my loved ones instead of stuck at an airport without family on Christmas.

When I first watched Unaccompanied Minors, I wanted to travel alone and wander the airport by myself to be independent and have some freedom. However, traveling is sometimes not very exciting, and it is especially grueling when you lose your luggage and are without your family. This movie reminds me to be grateful that I’m with my mom on the biggest holiday of the year instead of sleeping in an uncomfortable airport chair next to random strangers. Nevertheless, the kids in the film make the best of it by exploring the behind-the-scenes rooms of the airport and by making lifelong friends in the process of running from airport security. The ability of these kids to make the best out of their situation is very admirable, and it teaches me to look for creative solutions during times of bleakness.

Overall, this movie is one of my favorites because the storyline is very easy to relate to, in part because all of the characters are teenagers (or nearly so), and in part because it teaches lessons about teamwork, family, and how to spread the Christmas spirit.

Best Christmas Scene: The Yule Ball from Harry Potter and the Goblet of Fire (2005)

Even though this movie was filmed many moons ago, the special effects and sets are jaw-dropping, and in general, it is a timeless flick.

Have you ever seen anything more beautiful? That was my exact reaction when I first saw this movie at age seven. When I was a youngster, I used to think prom would look like this, but sadly, the Harry Potter Yule Ball and Catalina Prom are not in the same lane. It makes me wish I weren’t a muggle and that I had attended Hogwarts; life is simply not fair sometimes.

This scene is the epitome of winter wonderland. It has ice sculptures, multiple Christmas trees, formal attire, romance, MEN, and of course, magic. Moreover, the beauty of Hermione as she walks down the three flights of stairs to the arms of Viktor Krum can be nothing other than GOALS. She looks so effortlessly beautiful and walks with ease even with everyone’s eyes on her. In that moment, Hermione reminds every little girl that smart girls can get the guy and have the brains, too.

Hermione reminds every little girl that smart girls can get the guy and have the brains, too.

Harry Potter and the Goblet of Fire is a great movie to watch with friends on a rainy day in Monterey in December when you are in need of a little fantasy and mystery in your life. However, if you do not want to see the death of (SPOILER ALERT) Cedric Diggory, played by Robert Pattinson (from Twilight), then I suggest you grab some blankets and watch Elf, because there is nothing sad about Elf.

Kickin’ It with Kaylaa



Kaylaa Kawasaki ’17 is a tennis player and has a strong passion for all things sports. In her podcast, Kaylaa interviews Athletic Director Paul Elliott, student athletes, former athletes, and faculty. She focuses on their perspective of the Santa Catalina athletics program and gives updates on how the teams are doing.

The Greater Good by Loleï Brenot


Since the September 11, 2001, terrorist attacks, the United States has declared a “War on Terror.” While this war has led to positive outcomes, such as the death of Osama bin Laden and the creation of the Department of Homeland Security, other actions have raised questions about the government’s and the president’s true constitutional powers. The actions taken by the Bush administration in response to these attacks, chiefly the creation of the Uniting and Strengthening America by Providing Appropriate Tools Required to Intercept and Obstruct Terrorism (USA PATRIOT) Act, led to a questioning of the legality of mass public surveillance. Although these governmental security measures infringe upon one’s individual and constitutional rights, one must consider the grand scheme of things and the future generations of our world. The growth of terrorism at an unprecedented scale is not compatible with the Western way of life or with a successful future. Thus, the United States government and the National Security Agency do have the authority to surveil the residents of the United States of America, as surveillance is a protective and offensive action taken to shield the United States and its citizens from the ever-growing and prevalent threats of both domestic and international terrorism.

Since knowledge of the National Security Agency’s surveillance came to light, first in 2005 through the New York Times and then again in 2013 through Edward Snowden, there has been a national outbreak of outrage directed toward the PATRIOT Act. Many Americans have spoken out about its unconstitutionality as well as the severity of the government’s infringement upon individual rights. The Bush administration was immediately attacked, as President Bush had seemingly gone around the checks and balances of the American system and executively ordered the unconstitutional surveillance of the American people. As Risen and Lichtblau reported in the 2005 New York Times article “Bush Lets U.S. Spy on Callers Without Courts,” government officials said, “Under a presidential order signed in 2002, the intelligence agency has monitored the international telephone calls and international e-mail messages of hundreds, perhaps thousands, of people inside the United States without warrants over the past three years in an effort to track possible ‘dirty numbers’ linked to Al Qaeda.” The NSA has since released information about how communications are used: its infographic shows how one’s communications are run through the government’s surveillance program. This, along with outside information regarding how one’s data is processed, shows that the NSA internet surveillance program, PRISM, processes information and only sends it on to a deeper round of examination if something troubling is flagged. Despite this, apprehension and distrust still remain, mostly due to the government’s lack of clarity on the issue at the beginning of the program’s creation and use. Furthermore, this secretive spying on Americans has faced strong opposition from both major political parties.
Even with past measures put into place to prevent such infringement upon one’s rights, chiefly the National Security Act of 1947, which specifically prohibits domestic intelligence operations, actions contravening these laws were nonetheless taken. However, through this surveillance, dozens of terrorist attacks, both international and domestic, have been thwarted, as NSA Director General Keith B. Alexander testified at the June 18, 2013 House Intelligence Committee hearing.

While many believe that the government has taken unconstitutional actions, others believe instead that the world has come to a place where some sacrifices must be made for the greater good. Citing the “country before self” argument, they hold that citizens must be willing to part with the rights their predecessors fought for long before them in order to ensure the safety and security of their country.

As President Kennedy stated in his inaugural address, “Ask not what your country can do for you; ask what you can do for your country.” This concept of country first is a value instilled in the spirit of Americans of all colors and creeds, and thus these government actions were taken in order to ensure the safety of the United States and its citizens.

In the United States particularly, where the mantra and mentality of “country before self” is so strong and prevalent throughout much of the population, surveillance is seen as a small sacrifice to ensure the safety of the American dream.

Individual rights are among the most important rights for American citizens, as the individual rights of man are what spurred the birth of the United States in the first place, over 200 years ago. Apart from the infringement upon individual rights, other damages have been caused by the surveillance, such as economic loss due to tax dollars being funneled into the NSA and a lack of trust in the U.S. government. While these are further costs of surveillance, they are minimal in the long run if they lead to a safer world. Furthermore, often cited in the argument against surveillance is Benjamin Franklin’s statement, “Those who give up essential liberty to purchase a little temporary safety deserve neither liberty nor safety.” While this quote can be effective in touting the belief that the Founding Fathers would have strenuously disagreed with this surveillance, Franklin does specify “temporary safety.” In the case of terrorism, however, the world is not fighting for temporary safety but rather to ensure the blessed continuation of life as it is currently known for generations to come.

While mass surveillance raises the issue of the ever-growing power of the federal government, if people are truly concerned with keeping with the Founding Fathers’ ideals of a balanced, centralized government, the president and their power must be kept in check. In this day, civil rights must be more clearly defined, as the world has changed so rapidly and drastically in the 200 years since our country’s birth. The Founding Fathers could never have thought to take something such as mass surveillance into account, and thus suitable government actions must be more clearly outlined for the world to see. Since Alexander Hamilton did specify in Federalist No. 23 that “it is impossible to foresee or define the extent and variety of national exigencies, or the correspondent extent and variety of the means which may be necessary to satisfy them,” arguments can be made that in the case of the War on Terror, the actions taken by the Bush administration were within reason of America’s founding ideals.
In the great words of President Reagan, “There should be no place on earth where terrorists can rest and train and perfect their deadly skills. I meant it. I said that we would act with others if possible to ensure that terrorists have no sanctuary anywhere.” The deadly poison that is terrorism has not yet been stamped out, but the American government has a duty to continue all efforts to do so. Over the past eight years, the American public has lacked a strong leader willing to stand up and condemn these horrifying actions, relying instead on an overly politically correct and evasive approach to this prevalent issue. Thus, the lack of faith that crept in at the end of President George W. Bush’s term on the War on Terror has only continued to grow. Ultimately, until the threat of terrorism is under control, faith and trust must be placed in the hands of the government.

The threat of domestic and international terrorism has driven the United States government to take drastic actions in the War on Terror, but these are drastic actions taken with cause.

The fight for the survival of the American ideal and dream is one that must be upheld in the long term, even if it means compromising certain values in the short term.

New Frontiers in Tech by Katie Gorton

“Above all, we must embrace that quintes­sentially American compulsion to race for new frontiers and push the bound­aries of what’s possible. If we do, I’m hopeful that tomorrow’s Americans will be able to look back at what we did—the diseases we conquered, the social problems we solved, the planet we protected for them—and when they see all that, they’ll plainly see that theirs is the best time to be alive. And then they’ll take a page from our book and write the next great chapter in our American story, emboldened to keep going where no one has gone before.”

President Obama

Recently, President Barack Obama took the time to guest-edit the November “Frontiers” issue of Wired magazine, becoming the first sitting president ever to guest-edit a magazine. The main focus of the November issue is the future. Mr. Obama covers issues ranging from landing men and women on Mars to precision medicine and how the human genome can unlock treatments for some of the world’s deadliest diseases.

In the magazine, he discusses his optimism for the future, driven by the constant churn of scientific progress. He shares stories and ideas about what lies beyond the barriers we haven’t yet broken. All of these innovations, he claims, will make the world better for the planet, for individuals, and for communities, from solutions to climate change to new technology in medicine.

In a new video series on Wired’s YouTube channel, President Obama discusses the exciting future of artificial intelligence. Artificial intelligence is intelligence exhibited by machines: AI would allow cognitive functions we associate with human minds, such as reasoning and problem-solving, to be carried out by computers. This shift will not happen overnight, but it is predicted to occur within the next twenty or thirty years.

He states that AI will make the world safer by eliminating human error (e.g., with self-driving cars) and by stimulating the economy, but he recognizes that AI could also have downsides: eliminating jobs, increasing inequality, and suppressing wages. There are even concerns that AI could surpass our ability to understand it and that machines will end up doing everything without humans, thereby decreasing jobs, but it is important to remember that these scenarios are all hypothetical. Many confuse the reality of AI with the science fiction that surrounds it. It seems, however, that there is more good than bad to come from it. Mr. Obama compares AI to the way we view calculators: as an extension of our intelligence and as a way to create good rather than cause harm.

AI is currently being developed to diagnose diseases and develop treatments for them, as well as to power self-driving cars that will be much safer than human drivers. There are myriad other areas of tech the President touches on in the November issue. The new “Frontiers” edition of Wired magazine was released on Oct. 25, and I hope all of you will enjoy it!

On Why Hamilton is Everything by Taylor Moises

I am obsessed with Hamilton: The American Musical, and you should be, too. Why? Because Hamilton isn’t just an award-winning musical–it’s a cultural phenomenon that tells “the story of America then told by America now.”

It is topically relevant to American society and politics today while still providing top-notch entertainment for all types of people to enjoy.

Here at Catalina, people either love Hamilton or hate Hamilton. For those of you who have not yet been enlightened to the greatness that is Hamilton: The American Musical, it is a Broadway musical sharing the story of one of America’s own founding fathers: Alexander Hamilton. Why in the world would a musical about an old, dead guy be so widely raved about? Because the genius behind it, its creator Lin-Manuel Miranda, made the musical topically relevant to American society and politics today while still providing top-notch entertainment for all types of people to enjoy. Lin-Manuel was able to liken Alexander Hamilton’s story to that of a contemporary rap artist and then blend rap and conventional show tunes as the basis of the musical.

Now for a quick history lesson: Who exactly was Alexander Hamilton? If you are a Hamilton fan, you already know who he was and then some; for others, you have heard his name if you have taken U.S. History with Mr. Place, and you have seen his face if you have ever seen a ten-dollar bill. Nonetheless, most people do not know much about Alexander Hamilton except that he was a founding father and that he died in a duel with Aaron Burr. Yet Alexander Hamilton was the first secretary of the treasury of the United States, and he created our financial system. From this information, he does not seem like the obvious basis for a hit musical; however, Hamilton’s formal accomplishments do not sum up all that happened during his short but eventful lifetime. After reading Ron Chernow’s biography of Alexander Hamilton, Lin-Manuel realized Hamilton’s life was full of scandals, duels, and drama, perfect material for entertainment. The musical follows Alexander Hamilton’s rise from a poor orphan from the Caribbean to George Washington’s right-hand man in the Revolutionary War to his trusted aide when Washington became the first president. The musical ends with (spoiler alert!) Hamilton’s death after his duel with Aaron Burr.

One reason to be obsessed with Hamilton, and why I am obsessed with it, is the ingenuity behind its concept and its effective execution, both of which are mostly due to Hamilton’s own founding father, Lin-Manuel Miranda. Miranda, along with close friends and colleagues, worked for seven years to create what is now the musical. When he would explain that he was working on a rap musical about Alexander Hamilton, people did not see the connection or how it would work, but because they knew Lin, they trusted that whatever it turned out to be, it would be amazing. They were not wrong to believe in him. In 2009, Lin-Manuel was invited to the White House, and he performed what was to become the opening number of Hamilton. The audience, including the Obamas, laughed when he introduced the piece, but by the end of the song, everyone, again including the Obamas, was on their feet applauding. This arc, from skepticism on first hearing about Hamilton to amazement on finally listening to it, is typical. With his love for theatre and for hip-hop, Lin-Manuel successfully combined both styles to create this masterpiece.

Beyond the initial brilliance of successfully intertwining contemporary music with show tunes, Hamilton is made even better by its characters and the actors behind them. Another crucial aspect of Hamilton is its racially diverse cast. All of the principal cast members are people of color: the Puerto Rican creator, Lin-Manuel, plays Alexander Hamilton; the actors who play Presidents Washington, Jefferson, and Madison are African American; and the lead actress is Asian American. This diversity is necessary because Lin-Manuel embedded in Hamilton’s identity the idea of telling the story of the founding of America through the diverse inhabitants of America today. Hamilton could not be as successful as it has been if the cast were all old, white men. By including a racially diverse cast that reflects America today and using music less traditional for musicals, the show brings Broadway closer to a less elite crowd and makes it more available to all. It shows that Broadway musicals are not just for white people.

From its conception, Hamilton has attracted a large celebrity following along with popularity among the general public. Celebrities from Beyoncé and Jay-Z to Will Ferrell to Shonda Rhimes have watched the musical at least once. The Obamas have seen the show multiple times and even invited the cast to perform at the White House. The show is in such high demand that the next available tickets are for May of next year; seniors will be done with classes by the time tickets are available. While the scarcity of tickets makes the show less accessible, Lin-Manuel and those working on Hamilton are creating new ways for more people to watch the show live. Every day there is a lottery in which twenty-one people win front-row tickets for ten dollars (versus thousands of dollars). In addition to the lottery, there is often a “#Ham4Ham show” (so named because lottery winners give a “Ham,” a ten-dollar bill, to watch Hamilton) that the cast hosts outside the Richard Rodgers Theatre, an extra performance featuring cast members, crew, or special guests as another way to give more to the Hamilton fan base. There is also a free show on some Wednesdays that New York City students can attend if they in turn take a specialized curriculum about Alexander Hamilton and then perform an original piece based on what they learned on the Richard Rodgers stage for other schools and the Hamilton cast. These experiences are just some of the many ways the minds behind Hamilton are working to make it available to as many people as they can, especially those who normally would not be able to attend a Broadway performance. Soon there will be Hamilton openings in Chicago, San Francisco, and Los Angeles, a nationwide tour, and a production in London. While it is the single hottest ticket on and off Broadway right now, these expansions and its continual success ensure Hamilton will be around for a long time.

Now, on a more personal level, here is why Hamilton is so amazing to me. As I have mentioned before, it has made Broadway shows more available to people who were never interested in musicals before, but it has also opened the door of rap music to theatre kids. For me, I was a rap fan turned musical fan. I have joked before that I am the “Troy Bolton” of my class–basketball player turned theatre kid–and Hamilton definitely is mostly to blame. It was such a seamless transition because of its hip-hop style and contemporary diction that I became obsessed quickly. I have all the words from every song memorized. Those who have listened to the cast album can attest that it is beyond catchy for musical geeks and just plain catchy for everyone else.

I got to watch it this summer on July 2nd, and it was one of the best experiences I have ever had.

You might be skeptical as to how I can be so sure Hamilton is so amazing, but as one of the lucky few who have had the privilege of actually watching the show, I can assure you it is everything I have hyped it up to be. I got to watch it this summer on July 2nd, and it was one of the best experiences I have ever had. The performance was one of the last shows with the original Broadway cast, which is a big deal, especially for this musical, because the roles were tailored to many of the actors. Leslie Odom, Jr. humanized Aaron Burr, the villain, making the audience empathize with him even when he kills Hamilton. Plus, Lin-Manuel handpicked Daveed Diggs to play the Marquis de Lafayette and Thomas Jefferson because of his superb rapping skills, and he chose Chris Jackson to play George Washington before he even knew Hamilton would be a musical. I was sitting in what was probably the worst spot in the entire theatre, but I can attest that there truly is not a bad seat in the house. Just being in the room where it happened was enough. I only cried seven times, because I had to tell myself to keep it together since the tears blurred my vision and impaired my view of the show. Also, because Hillary Clinton was at this performance, I may have been in the same room as the next president of the United States.

Now, why would something I make sound so wonderful be hated at Catalina, as I said it was? Many people hate it because the obsessiveness of fans like me can be annoying. So yes, I apologize for singing it out loud and making so many Hamilton references, but it is relevant in everyday life, which makes it hard not to quote. During an election year with candidates knee-deep in mudslinging and scandals, Hamilton speaks to many current political issues. Although you will probably get some eye rolls when Hamilton is mentioned at Catalina, when was the last time people, especially those who aren’t theatre enthusiasts, had a strong opinion on a musical? This show has surpassed the conventions of a musical; it is a cultural phenomenon and an integral piece of pop culture. Hamilton incorporates social and political issues while reaching a broad audience and providing great entertainment and art.

This may still not be striking enough for you. If you cannot see the brilliance behind this Pulitzer Prize-, Grammy-, and Tony-winning musical, or refuse to give it a shot even after reading this article, my humble efforts have been useless. I don’t see how you can say no to this, but I am satisfied you at least read up to this point. We know the theatre kids already play Hamilton non-stop, but it didn’t feel right to throw away my shot at trying to explain Hamilton’s genius. One last time, I’ll say that it is worth it to give Hamilton a shot.