My Fall 2014 Social Media Class Project In Review


In the last few posts, I’ve been writing about my Social Media class and the semester project we’ve been doing. To recap, students create a social media content strategy for our department’s social media (the details of the assignment are in the previous post). They then use this plan to create content for the department. They create content three times, each time for a specific time period. The content is presented to the class and then goes through an editorial process (i.e., I grade it and make any needed modifications) before being published.

With the semester winding down, I want to share some of the work the students have been doing!

Students have done a great job across the semester and have worked hard to create content that resonates with students while also targeting our goals and conveying our key messages. Running an account for a small university department is a unique challenge. Although as professors we are exhilarated by what we teach and have a love and passion for school, it is a bit tougher to get students excited about, well, that part of the college experience responsible for all the work they have to do. :) Believe it or not, school is the last thing some students want to be thinking about when they aren’t in class. :) Having to overcome the challenge of promoting school is a great experience for students, and I’m very pleased with how they have done in the face of it. The semester began with very little content on our accounts, and few followers.

Students have done a particularly strong job working between groups to create content that works across platforms. For example, you’ll see how some of our blog posts tie into our Instagram and Twitter in regard to profiles of students and faculty.

Our class was divided into 3 groups:

Twitter – Prior to the class starting, we had a Twitter account, but it was rarely posted to. Now we have a variety of content, from the informative to reminders of important dates to community-building memes and humorous posts students in our department can relate to.
Blog – The blog is brand new and our department hasn’t done much to publicize it yet, but students have done a great job getting it going. We’ve had highlights of students and insights into classes and events students are a part of. Note: Part of the reason it hasn’t been publicized is that the university is in a transition stage with its website, and we are waiting to see how that will impact our online presence.
Instagram – Similar to Twitter, Instagram was something we had set up but hadn’t done much with before the semester. Our Instagram team began creating videos to highlight professors and students (we’ve had a few sound issues, but are working them out). These videos are accompanied by behind-the-scenes photos of the individual. There are also photos of other events.

Students have one more round of content they will be turning in this week. And that content will be scheduled to carry us through the winter break.

Altogether, I’m very pleased with how students have worked to help humanize our department and enable current students to connect with one another and prospective students to get a look at who we are and what we do. I feel we are moving in the right direction. In the last few days, interest in our content has really taken off as students have reached out and begun highlighting the work their fellow students are doing. I am excited to see how the department’s social media grows and advances over time.

Is this a project I would do again? Absolutely. Students really bought into this project and worked hard to see it through. They expressed to me that they learned a lot from the class and doing this project. And, they said they enjoyed the opportunity to get hands-on experience. It was a very fun semester! I had a great bunch of students and I am very proud of all of them! I plan to continue with this assignment next year.

I’m not teaching Social Media next semester. So where will the department’s social media content for Spring 2015 come from? I’m not entirely sure yet. But I’ve got some ideas in the works and a strong foundation to build upon!

Thoughts? Questions? Recommendations for this project? Would love your comments and feedback below or send me a Tweet.
– Cheers!
Matt

You Can Tweet a Quote Directly From a Pew Report

I teach Comm 335 Writing Across Platforms (see syllabus), a class that in part looks at writing news releases and other content for the web. One tactic we talk about is creating Tweetable content for our social media releases assignment. PitchEngine – the social news release website we use for this assignment – enables users to write ‘quick facts’ that readers can Tweet.

So when I saw today a similar, more streamlined approach used by the Pew Internet Project in their reports, I had to make a quick blog post about it. I was reading the Cell Phones, Social Media, and Campaign 2014 report when I stumbled across this.

See screen grab below:

Click to enlarge.

I love this tactic – and wish I had thought of it to teach to my students. :) I may just integrate this into my lecture next semester. I wish I had access to stats from Pew to know how effective these are.
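For anyone wanting to replicate the tactic on their own site, the link behind a “Tweet this” quote can be generated with Twitter’s web intent endpoint. Here is a minimal Python sketch – the quote, report URL, and handle below are placeholders for illustration, not taken from the actual Pew report:

```python
from urllib.parse import urlencode

def tweet_link(quote, url=None, via=None):
    """Build a pre-filled 'Tweet this' link using Twitter's web intent.

    Clicking the link opens a compose window with the quote filled in;
    the reader can still edit the text before posting.
    """
    params = {"text": quote}
    if url:
        params["url"] = url  # link back to the report
    if via:
        params["via"] = via  # attribution handle
    return "https://twitter.com/intent/tweet?" + urlencode(params)

# Placeholder quote and URL for illustration:
print(tweet_link("An example quick fact from a report",
                 url="http://example.com/report",
                 via="examplehandle"))
```

Dropping a link like this next to each pull-quote is essentially all the Pew pages are doing.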

Have you seen this before elsewhere? What do you think?  Is this effective – do people want to share pre-written Tweetable quotes, or do they want to be able to put it into their own words?

Sentiment Analysis using Content Analysis Software: Project Assignment

In the last two posts, I’ve been discussing the Yoshikoder sentiment analysis project in my Communication Research class here at Shepherd University.

My first post looked at the project in general. The second, most recent post looked at how to teach computer-assisted content analysis using the Yoshikoder software and the activities I provide my students to prepare them for the project.

I encourage you to check out those posts for background and setup! Ok, now on to sharing the assignment itself and providing a brief overview of it.

As I’ve stated elsewhere, the purpose of this assignment is to

1) Give students a hands-on look under the hood of sentiment analysis – that is, to understand HOW it works and its flaws.

2) Teach students via hands-on experience about quantitative content analysis, particularly computer-assisted content analysis.

3) Teach them how to conduct a computer-assisted content analysis using software (Yoshikoder).

So here’s the setup to the assignment (which you can see below). This hands-on learning project is based on a real brand and a realistic but made-up scenario. I do this with both this assignment and my first project in this class. Specifically, I provide The Situation or Problem / Campaign goals and objectives (of an imaginary campaign that is ongoing or has happened) / benchmarks / KPIs.

In this case, the situation had to do with a popular online retail brand and rising customer complaints and dissatisfaction as the brand has grown beyond its core base of loyal customers in recent years. I’ve redacted the brand and the situation from the assignment below, but you can fill in your own.

I rely on Stacks’s (2011) model for writing the problem, goals, and objectives. While I provide the research objective(s) in my first project, in this project students must come up with the research objective(s) and RQ(s).

I then provide some benchmarks. In this scenario, at a certain point in time sentiment was strong (let’s say, 70% positive). And then after the hypothetical situation, it dropped (say, to 50%). The students have been recently introduced to the concepts of benchmarks and KPIs via a brief lecture, so this is their first experience with these concepts. They are given 1 KPI (let’s say 65% positive sentiment) against which to measure their success. Keep in mind that the situation assumes that a campaign already took place aimed at addressing decreased customer satisfaction and negative comments on Twitter addressed at the brand of choice. We are now seeking to assess whether this campaign that happened successfully increased sentiment towards the brand (at a deeper level, repaired relationships and the image of the brand among the online community).
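To make the arithmetic of the assessment concrete, here is a tiny Python sketch of the benchmark/KPI logic using the hypothetical numbers above (70% benchmark, 50% after the situation, 65% KPI); the tweet counts are invented for illustration:

```python
# Hypothetical numbers from the scenario above – not real data.
BENCHMARK_BEFORE = 0.70  # positive sentiment before the situation
BENCHMARK_AFTER = 0.50   # positive sentiment after the situation
KPI = 0.65               # target the campaign needed to reach

def kpi_met(positive_tweets, total_tweets, kpi=KPI):
    """Return the share of positive Tweets and whether it meets the KPI."""
    share = positive_tweets / total_tweets
    return share, share >= kpi

share, met = kpi_met(positive_tweets=680, total_tweets=1000)
print(f"{share:.0%} positive; KPI met: {met}")  # 68% positive; KPI met: True
```

The students’ version of this is even simpler – a single division on the Yoshikoder output – but the comparison against the KPI is the same.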

There is another important consideration students must make: since we’ve discussed sentiment and its flaws, they need to think about the valence of sentiment (the AFINN dictionary scores terms from -5 to +5), and they need to research and understand how AFINN was designed and works (I provide some sources to get them started). If you’re not familiar with the AFINN dictionary, it was designed for sentiment analysis of microblogs. It is a free sentiment dictionary of terms you can download and use in Yoshikoder.
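To show what a dictionary like AFINN is doing under the hood, here is a toy scorer in Python. The mini-dictionary below is made up for illustration; the real AFINN file maps a few thousand terms to integer scores from -5 to +5:

```python
import re

# A made-up, AFINN-style mini-dictionary (the real file is much larger).
AFINN_SAMPLE = {
    "love": 3, "great": 3, "good": 3,
    "bad": -3, "terrible": -3, "hate": -3,
}

def score_tweet(tweet, dictionary=AFINN_SAMPLE):
    """Sum the valence scores of every dictionary term in the Tweet."""
    words = re.findall(r"[a-z']+", tweet.lower())
    return sum(dictionary.get(w, 0) for w in words)

print(score_tweet("I love this brand, shipping was great"))  # 6
print(score_tweet("terrible service, I hate waiting"))       # -6
```

A summed score above zero gets treated as a positive Tweet and below zero as negative – which is exactly where the flaws students discuss (sarcasm, negation, slang) creep in.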

For more details on the assignment, check out the assignment embedded below and the requirements for what must be turned in.

As I’ve noted in a previous post, this project isn’t perfect. But it is a fairly straightforward and accessible learning experience for students who are in their first semester of seeing how research can be conducted. It covers a wide array of experiences and learning opportunities – from discussion of what sentiment is, to understanding its flaws, to understanding the flaws of quantitative content analysis, to learning to apply a number of key research terms, as well as providing exposure to how to write research reports. The project itself is bolstered by several lectures; it comes about halfway through the semester and takes several days of hands-on learning in the classroom. Students of course finish the write-up outside of class, but we do the analysis all in class to ensure students are getting my help as the “guide on the side.”

My previous post covers some activities we do to build up to this assignment.

So that’s all for now! Please feel free to use this assignment, to modify it, and to improve it. If you do, come back and share in the comments below how you have improved upon or modified it!

If you want to know more about my Communication Research class, please see this post which includes the syllabus.

Teaching Computer-Assisted Content Analysis with Yoshikoder

Last blog post I discussed the second project in my applied research class, a sentiment analysis of Tweets using Yoshikoder - a free computer-assisted content analysis program from Harvard.

As promised, I want to share my assignment, and my handout for students that teaches them how to use Yoshikoder. Before we do the project, however, I do a brief in class activity to get students learning how to use Yoshikoder. So let’s start there for today’s post. And next post, I’ll share the assignment itself.

PART 1: THE SET UP

What I like to do is present the problem to the students via the project assignment. Then we go back and start learning what we’d need to do to solve the problem. So, after lecturing about what sentiment analysis is and why it is important, I first introduce students to the idea of constructing a coding sheet for keywords by taking a list of keywords and adding them to categories.

First, we talk about the idea in class, and I show them some simple examples, like: if I wanted to code a sample for the presence of “sunshine” – what words would I need? Students brainstorm words like sun, sunny, sunshine, and so on.

We discuss the importance of mutual exclusivity, being exhaustive, etc.
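One quick way to show what mutual exclusivity means in practice is to check that no term has been placed in more than one category. Here is a small illustrative sketch – the terms are examples of mine, not the list from the class exercise:

```python
# Illustrative coding categories – each category is a set of terms.
categories = {
    "sunshine": {"sun", "sunny", "sunshine", "rays"},
    "rain": {"rain", "rainy", "storm", "shower"},
}

def overlapping_terms(categories):
    """Return any terms appearing in more than one category
    (violations of mutual exclusivity)."""
    seen, overlaps = set(), set()
    for terms in categories.values():
        overlaps |= seen & terms  # terms already claimed by another category
        seen |= terms
    return overlaps

print(overlapping_terms(categories))  # set() – categories are mutually exclusive
```

If a term shows up in the overlap set, the coding scheme needs revising before any coding happens – the same check students do by eye on their coding sheets.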

I show an example from my dissertation which looked at agenda setting topics on Twitter.

On the class day before I introduce Yoshikoder to the class, students do a practice assignment where I give them a list of random terms related to politics and elections. They then have to create “positive” and “negative” content categories using the terms. The terms aren’t necessarily well suited for this exercise, which gets them thinking a bit… They then hand code a sample of Tweets I provide about two different politicians. I tend to use the most recent election – so, in this case, Obama and Romney. They are frustrated by having to hand code these Tweets – but a little trick is to search for the exact phrases in the Tweet files on the computer, and they are done fairly quickly. Ok, so on to the next class period:

1) Practice with Yoshikoder: We do the same basic task, but this time they learn to program their “positive” and “negative” categories into Yoshikoder. They then load the Tweets (which I have saved as a .txt file) and analyze them for the presence of their positive and negative content categories. This is a great point to stop and have students assess the reliability between what they hand coded and what the computer coded. Often there will be discrepancies, and this makes for a great opportunity for discussion.
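The hand-vs-computer comparison can be as simple as percent agreement. A minimal sketch of that check (the codes below are invented; Yoshikoder doesn’t compute this step for you, so we reason through it in class):

```python
# Invented codes for six Tweets, coded by hand and then by the computer.
hand_codes     = ["pos", "neg", "pos", "pos", "neg", "pos"]
computer_codes = ["pos", "neg", "neg", "pos", "neg", "pos"]

def percent_agreement(a, b):
    """Share of items on which the two coders assigned the same code."""
    matches = sum(x == y for x, y in zip(a, b))
    return matches / len(a)

print(f"{percent_agreement(hand_codes, computer_codes):.0%}")  # 83%
```

The disagreements (here, the third Tweet) are exactly the cases worth putting on the projector and discussing.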

Here is the activity that I use in class. I also provide Tweets that I’ve downloaded using the search terms for the politician/candidate I’m using in the activity (e.g., Obama; Romney) in plain text format so Yoshikoder can read them. Also, see the handout below, which I provide to students to show them how to use Yoshikoder and how to program and run the analyses I just described.

As I mentioned above, I create a handout for students that explains the different functionalities of Yoshikoder and how to run the analyses. As I’ve discussed elsewhere, I like to provide handouts. The one below isn’t one of my more elaborate handouts, but it provides a quick overview with some screenshots to show which buttons need to be clicked. This is super helpful if you are trying to learn Yoshikoder or want to use it alongside the activity discussed in this post (or the project discussed in my last post, which I will provide in my next blog post).


Enjoy!

EDIT: The assignment is now up. See the post.

If you’d like to learn more about using Yoshikoder, I found this great tutorial:

- Cheers! Matt

Applied Research Class: Sentiment Analysis Project Reflection

I began this semester with the intention of blogging a bit about my applied research class. I provided an overview of it and a copy of the syllabus on an earlier post. But since writing that post, I’ve yet to do a follow up… until now.

Edit: There are 2 follow up posts to this post. 1) Looks at activities for this assignment, and 2) provides the assignment itself.

First, let me say that more and more I am trying to decrease my lecturing and spend more time in class on hands-on learning, having my students learn by doing rather than just listening – sort of like the flipped classroom Gary Schirr has been discussing recently on his blog. So this class is really pushing in-class projects and experiential learning. Following this approach, in order to introduce students to research, I provided students with the instructions and a lot of structure for their first two projects.

I want to use our second research project as an example. Then, I’ll talk about the pros and cons. The second project was a sentiment analysis of Tweets about a brand I chose and a (realistic but not necessarily real) scenario.

My goals with this project were to teach students:

  1. About computer-assisted content analysis. We focused on how it differs from a hand-coded quantitative content analysis (which was the focus of our first project), and on its strengths and weaknesses.
  2. How to do a basic computer-assisted content analysis using Yoshikoder, an easy-to-use, free app that works on Mac and PC – so my students can use it at home if needed!
  3. About sentiment analysis – what it is, why it is used by organizations to evaluate the online conversation about their brand, and its strengths and weaknesses.
  4. How to write up a research report (In the first project, I provided the project overview and requested results and discussion. In the second project, I added a literature review and methods section, and had them write the research objective and research question).

Why I chose to do this project this way: A number of social media analytics tools today are offering sentiment analysis.  There are also sites like socialmention.com that will provide you with a free sentiment analysis of a search term. But how are these analyses conducted? What are their strengths and weaknesses? Are they reliable? Do they mean anything at all? And what do we need to be careful of before accepting them, and thus drawing inferences from them?

So what I wanted my students to do was to SEE how a sentiment analysis would be conducted by some of those high-priced (or no-price!) analytic tools. In other words, I want my students to get their hands dirty as opposed to allowing some distant and hidden algorithm to do the analysis for them. I believe gaining hands-on experience with this project provides students a more critical lens through which to see and evaluate a sentiment analysis of social media messages.

The Set Up: I provide in the assignment: The Situation or Problem / Campaign goals and objectives (of an imaginary campaign that is ongoing or happened) / benchmarks / KPIs. In this case, the situation had to do with a popular online retail brand and rising customer complaints and dissatisfaction as the brand has grown beyond its core base of loyal customers in recent years.

I provide students with a sample of about 1,000 Tweets I downloaded and formatted to play nicely with Yoshikoder. The sample comprises mentions of the brand. This ensures students are all looking at the same dataset, and streamlines (or eliminates, I should say) the data collection process to help students focus on other elements of the assignment.

For the sentiment analysis, I rely on the AFINN dictionary, which was designed for sentiment analysis of microblogs. Students learn what the AFINN is and a little about how linguistic analysis dictionaries are created through research. Students then analyze the Twitter dataset using the AFINN dictionary to determine sentiment scores. There are no fancy stats being done here. By checking the sentiment analysis output, they simply determine whether their KPI (which was a % of positive Tweets about the brand) was met. In this case, the result they are looking for is a % – so simple division. Not scary at all; no SPSS training needed (that comes with a later project).

They also look at the valence of the sentiment (ranging from -5 to +5) and explore the meaning of that. The students use this information, along with class lecture, other exercises on how to write research reports, etc., to produce their project #2 report.

Again, to reiterate an important point, we discuss the benefits of this analysis as well as its real weaknesses. Students always bring up the fact that the results lack context – what if someone used the word “bad” meaning good? What about sarcasm? I show them how to use Yoshikoder to look at keywords in context as a way of addressing this.
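The keyword-in-context (KWIC) idea is simple enough to sketch: pull each hit with a few words on either side so a human can judge whether “bad” really means bad. Here is a minimal Python version – the example Tweets are invented:

```python
def kwic(texts, keyword, window=3):
    """Return (before, keyword, after) snippets for each match."""
    hits = []
    for text in texts:
        words = text.lower().split()
        for i, w in enumerate(words):
            if w == keyword:
                before = " ".join(words[max(0, i - window):i])
                after = " ".join(words[i + 1:i + 1 + window])
                hits.append((before, w, after))
    return hits

tweets = ["This brand is so bad it hurts",
          "bad to the bone! love this store"]
for before, kw, after in kwic(tweets, "bad"):
    print(f"... {before} [{kw}] {after} ...")
```

Reading the snippets, the first “bad” is genuinely negative while the second is slang praise – precisely the ambiguity a dictionary score alone misses.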

The Benefits and Drawbacks of This (and These Types of) Projects

As I said above, I am really trying to move away from lecture in favor of experiential learning. Here are some things I’ve noticed. Some may be benefits, others drawbacks, and others a bit of both…

  • The focus of this project is not on the stats or the analysis, and I provide a lot of the needed information – so it makes for a good ‘getting your feet wet’ project that teaches students other important elements of research.
  • It would be nice to teach them more advanced methods of analysis – but I do cover that a bit more later in the semester.
  • Students learn through their mistakes and from my feedback as opposed to me paving the way for them and simply asking them to drive down the smooth road.
  • I provide a LOT of handouts on how to write different sections of a research report, etc. They are detailed… sometimes too detailed and I fear students don’t read them because it is information shock.
  • Sometimes, I wish I had more time to teach them how to avoid the simple mistakes I see in their work, particularly their research reports. I say to myself, “oh man, I thought I told them how to do that.” Or, “Why didn’t you read the handout that explains how to structure this!?”
  • They likely won’t ever do sentiment analysis like this again – but at least they’ll understand it!
  • They get to see the results for themselves and get a sense that they discovered the results.
  • Class time is busy – our class rushes by and we don’t always get to cover everything I want to. As a person who likes order and time management, I am having to “let go a little” and let things happen. This is helping me grow. I wonder if it is helping my students though…
  • I know I enjoy doing these sorts of projects a lot more than standing and lecturing, lecturing, lecturing about research. I feel it has made research a lot more “real” and hands on to them.

So that is my overview of the project in general, and some thoughts. It isn’t perfect but it seems to have gone well and I really enjoyed doing it. I’d love any feedback or suggestions you may have to make this the best possible experience for my students. And of course, feel free to adapt, modify, or improve upon this idea.

In an upcoming post(s), I’ll share the assignments (I want to move my documents over to SlideShare due to the pay wall on Scribd). And I will provide some basic info on how to use the Yoshikoder software.

Cheers! -Matt

Just a reminder: There are 2 follow up posts to this post. 1) Looks at activities for this assignment, and 2) provides the assignment itself.

photo CC by netzkobold

Teaching The Applied Communication Research Class

Metrics, Metrics, Metrics! I hear it everywhere I turn. :) More than ever, we need to be teaching our students research skills.

This Spring 2014 semester I am really excited to be teaching an applied Communication Research class!

For two years at Utah Valley University, I taught communication research with an emphasis on academic research. You can see the syllabus for that class. In that class, student groups planned, wrote up, and executed a semester-long academic research study. Though many professors prefer not to teach this class, research is one of my favorite classes to teach. I’ve had numerous undergraduate students present their research at undergraduate research conferences and earn travel grants to do so. This is a super valuable experience for those considering grad school. Though it is very time demanding, and some feel teaching others how to conduct research is tedious, I didn’t find it that way at all. Seeing students get that “aha” moment in research and seeing them succeed makes teaching the class very rewarding.

This semester, I’ll be focusing on the more practical uses of research with an emphasis on using research for strategic purposes. This class emphasizes research across new media, legacy media, and interpersonal and online environments. Students will learn both quantitative and qualitative methods.

Our textbook is Paine’s “Measure What Matters: Online Tools for Understanding Customers, Social Media, Engagement, and Key Relationships.” I considered the Stacks book as well, but I liked the emphasis on new media in Paine and felt her book may be more accessible to students, as students can be intimidated by a research class.

This hands-on class will emphasize the following research skill sets:

  • How to conduct content analysis using a coding sheet.
  • How to conduct a computer-assisted content analysis
  • How to conduct interviews and focus groups
  • How to conduct quantitative electronic surveys using iPads

Students will work in teams to conduct 3 applied projects. The first 2 projects are real-world problems I set up that the students have to solve; in the 3rd project they have to identify a problem, write a proposal, and execute it:

  • Media placement evaluation – Answering questions such as placement, share of voice, and whether key messages are included in media coverage and to what extent. Done via content analysis of media clippings.
  • Sentiment analysis of social media content – What are people saying about your brand on social media, and what is sentiment towards it? Done via computer-assisted content analysis of Twitter posts.
  • Audience Research – Focuses on 1 of the 5 key PR variables discussed by Stacks (2011): Confidence, credibility, relationship, reputation (which may include awareness), or trust. Students will choose 2 of the following: interviews, focus groups, and surveys.
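For the media placement project, the share-of-voice calculation itself is just a proportion of coverage. A toy sketch with invented clip counts:

```python
# Invented clip counts from a hypothetical media clipping analysis.
clip_counts = {"our_brand": 24, "competitor_a": 40, "competitor_b": 16}

def share_of_voice(counts, brand):
    """Share of total coverage that mentions the given brand."""
    return counts[brand] / sum(counts.values())

print(f"{share_of_voice(clip_counts, 'our_brand'):.0%}")  # 30%
```

The students do this by hand from their coding sheets, but the logic is the same: your brand’s clippings divided by all clippings in the category.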

Students will be introduced to the following software:

  • Computer-assisted content analysis (Yoshikoder will be used as it is free and easy to learn)
  • Digital Survey programming with XLS Forms
  • Open Data Kit Collector – field data survey collection software (we will be using this with the XLS forms on the free FormHub.com online form tool).
  • SPSS – We won’t get too far into SPSS due to the other demands on students’ time, but students will learn data entry, descriptive statistics, and correlation analysis.

I’ll be posting the syllabus for the class soon! As the semester goes along, I hope to get up a number of blog posts expanding on the class, assignments, and so forth. So check back!

Have you taught research – what do you emphasize in your class? How can I improve my class? What key skill sets should we be teaching future practitioners?

-Cheers!

-Matt

- top photo CC by IntelFreePress

“Social Media and Mobiles” Social Media and Politics Research Published!

I hope everyone is staying warm! Here in the Eastern Panhandle of West Virginia, we’ve got some terribly cold weather heading our way tonight!

I want to take a moment to share some news from the research side of my life in academia. :) As you know, I research social media and civic and political participation.

I’m very excited because this past Friday, my latest co-authored study was published online in the journal New Media and Society.

This study, “Social Media and Mobiles as Political Mobilization Forces for Young Adults: Examining the Moderating Role of Political Expression in Political Participation,” is an extension of our earlier articles: “More harm than good? Online media use and political disaffection among college students in the 2008 election” (2013) in the Journal of Computer-Mediated Communication, and 2010’s Mass Communication & Society piece, “Did social media really matter? College students’ use of online media and political decision making in the 2008 election.”

Social Media and Mobiles really seeks to further investigate the seemingly important role of online political expression (such as posting political videos to YouTube, Tweeting about politics, or posting to Facebook, etc.) in political participation. Particularly, the study looks at what role online expression may play in moderating any effects of political media use on participation. Additionally, this study investigated political smart phone app use, something not investigated in the prior two studies.

Here is the abstract:

A web survey of college students was conducted to examine whether online political expression moderates the effects of political media use on political participation. Results showed that online political expression enhanced the effects of political mobile apps, traditional offline and online media, and social media on political participation. Implications are discussed for a mobilizing role of online media in the democratic process for young adults.

You can see my other posts on social media research.

Cheers!

Matt

photo CC zoonabar