In today's feature spotlight we highlight how you can take advantage of automated reward fulfillment. Using this fully integrated, automated service within Survey Analytics, you can save the time, money, and worry that come with fulfilling community rewards accurately and on schedule. Points and rewards work seamlessly, providing a "multi-channel" brand experience across both online and mobile communities. That frees up time to focus on your data, deliver targeted content, and draw more accurate conclusions.

Why should you use rewards and incentives within a panel or community? Rewards and incentives open the door to two-way brand communication. They build user engagement and motivate respondents, who know they will receive something for their time - and in the end, that builds strong loyalty and retention. Customers enrolled in a loyalty program visit 2x more often and spend 4x more money, and retailers with a loyalty program are on average 88% more profitable than competitors without one. (Source: Fivestars)
Thursday, December 19, 2013
Thursday, December 12, 2013
Today we tuned into a webinar titled "From Girls to Women: New Horizons in Market Research," hosted by Dana Stanley, VP of Market Research at FashionPlaytes, and Kristin Luck, Founder of Women in Research (WIRe) and President at Decipher, Inc. The webinar was co-presented and began with Kristin providing an overview of the impact women hold in brands, business, and the market research industry. Dana followed with new insights into tween girls - part of the tech-savvy, always-connected Generation Z - and shared how FashionPlaytes is engaging this hard-to-reach audience with gamified research elements both online and through a mobile application. Here are a few of the best insights we gained from tuning in to today's webinar.
#1: True Brand Intelligence is Female Informed
- 83% of consumer purchases are made by women
- 51% of personal wealth in the US is held by women
- Women are the primary decision-makers for household purchases
- Majority of advertising targeted to women is still created by men
- Female entrepreneurs are 15% more likely to be profitable, yet 50% less likely to get funding
- Firms with 3 or more women in top leadership positions score higher than their peers
- Greater diversity breeds more effective and profitable products and services
- The market research industry suffers a lack of diversity
- Less than 15% of the US Honomichl 50 companies are run by women (<1% globally)
- Women in research make on average $20K less than their male counterparts
#2: Mobile Engagement is the Key Moving Forward
Much of Generation Z does not have their own e-mail address, and those that do rarely check it. As this generation builds its social profile, we need to adapt to their preferred methods of communication. Parents of "FPgirls" are required to give consent for their tween girls to use the mobile app and for FashionPlaytes to gather more detailed research data. This connected generation expects a two-way dialogue with brands. Generation Z is accustomed to touch, highly opinionated, and the window for capturing their attention is smaller than ever before. It's not just old-school research and surveys anymore - it's finding fun ways to engage and letting their voices be heard through mobile and connected channels.
|FashionPlaytes "Club Z"|
FashionPlaytes knows their audience well, and they know what that audience wants. Aside from surveys, there are quick polls, forum boards, and areas where girls can get tips on style, school, fitting in, and more. The FashionPlaytes app will feature a unified rewards system spanning both research and non-research activities. Examples of prizes given include back-to-school fun packs, iPad minis, and - last but not least - One Direction "1D" goodies. Notifications to take surveys and quick polls will appear throughout the site and as push notifications on the mobile app (currently in development). Kristin Luck capped off the webinar by reviewing how important mobile is becoming in market research - and if you haven't already, now is the time to figure out how you will integrate mobile for 2014.
#3: Tween Girls LOVE Google!
Tween Girls on Facebook?
- 38% That's for my parents
- 18% Love it!
- 18% Like it
- 21% Heard of it, but don't know much
- 6% Don't know anything about it
Tweens and their thoughts on Google
- 69% Love it
- 26% Like it
- 4% Heard of it, but don't know much
- 3% Don't know anything about it
Tween Standpoints on Skype
Wednesday, December 11, 2013
This Black Friday we used our powerful smartphone survey/research platform SurveySwipe to collect passive data while shoppers completed our mobile survey. Passive data is collected through a mobile application running in the background, with the user's consent. Data is gathered from smartphone sensors, helping you analyze how people use smartphones in real-life scenarios. Collection runs continuously, based on the triggers and variables you configure. Variables that can be collected through passive data include: accelerometer, GPS (location tracking), app census, app usage, battery life, memory, 3G/4G network connection, wifi connections, disk space, gyroscope, IP addresses, retina display, SSID, and operating system. By collecting the apps-running and operating-system variables, we were able to conclude the following from our passive data collection study.
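To make the trigger-and-variable idea concrete, here is a minimal sketch of how a passive-collection configuration might pair each variable with a sampling rule. The function and variable names are hypothetical for illustration; this is not the actual SurveySwipe SDK.

```python
# Hypothetical passive-data trigger configuration. The variable names mirror
# those listed above; the API shape is illustrative, not SurveySwipe's.
PASSIVE_VARIABLES = {
    "gps": {"trigger": "interval", "every_minutes": 15},       # location tracking
    "battery_life": {"trigger": "on_change"},                  # sample when it changes
    "app_census": {"trigger": "interval", "every_minutes": 60},
    "wifi_connections": {"trigger": "on_event"},               # log each connect
}

def should_collect(variable, minutes_since_last=0, changed=False):
    """Decide whether a variable should be sampled right now."""
    cfg = PASSIVE_VARIABLES[variable]
    if cfg["trigger"] == "interval":
        return minutes_since_last >= cfg["every_minutes"]
    if cfg["trigger"] == "on_change":
        return changed
    return True  # "on_event" variables are collected whenever the event fires

print(should_collect("gps", minutes_since_last=20))            # True: interval elapsed
print(should_collect("battery_life", changed=False))           # False: nothing changed
```

In a real SDK the triggers would be registered with the OS (timers, sensor callbacks), but the decision logic is the same: each variable carries its own collection rule.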
#1: Apps Running While Our Survey Was Being Taken on Black Friday 2013
Amazon, Facebook, and Google Chrome were among the top applications running while respondents completed our Black Friday 2013 survey on the SurveySwipe mobile application. Given that it was Black Friday, it was interesting to see Amazon as the most frequently running application (11%) at the time the survey was taken, just ahead of Facebook (9%) and Google Chrome (9%). We can assume users were shopping on Amazon for deals and comparing prices throughout their Black Friday shopping experience. IBM's Benchmark, featured on TechCrunch, concluded that Black Friday 2013 online sales were up 20% and that 25% of online traffic on Black Friday came from smartphones. As we all know, many of us close our mobile apps when we are not using them - apps that track your location or send frequent updates and push notifications are notorious for draining battery life. At a larger scale of data collection, with triggers set to collect passive data more frequently, these numbers could very well increase.
#2: Respondents Operating Systems From Our Black Friday 2013 Survey
From the chart above, we concluded that 82% of mobile survey respondents were using an Android operating system, while only 18% of survey takers were running iOS. On the chart, 8-19 are Android OS indicators and 612-704 are iOS indicators. Android 16 (29%) was the most widely used OS, followed by Android 15 (16%) and iOS version 7.0.4 (10%). Knowing the operating systems helps you spot areas where you may be falling short. For instance, in our next study we will want to find a way to recruit more iOS users to download the application and complete our mobile survey. Measuring the operating system is also a great way to track retention and engagement by platform. By tracking operating system values over time, you can begin to answer questions such as: Are Android users more likely to return to complete future surveys? Are iOS users harder to engage, or more likely to delete the application and taper off?
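The platform-share and retention questions above amount to a simple grouping exercise over respondent records. Here is a small sketch of that analysis; the record fields and sample data are made up for illustration.

```python
from collections import Counter

# Hypothetical respondent records: the OS string reported by the device and
# whether the respondent returned for a later survey. Data is illustrative.
respondents = [
    {"os": "Android 16", "returned": True},
    {"os": "Android 16", "returned": False},
    {"os": "Android 15", "returned": True},
    {"os": "iOS 7.0.4", "returned": False},
    {"os": "iOS 7.0.4", "returned": False},
]

def platform(os_name):
    """Roll individual OS versions up to their platform family."""
    return "Android" if os_name.startswith("Android") else "iOS"

# Platform share: how many respondents ran each platform.
share = Counter(platform(r["os"]) for r in respondents)

def return_rate(plat):
    """Fraction of a platform's respondents who came back for a later survey."""
    group = [r for r in respondents if platform(r["os"]) == plat]
    return sum(r["returned"] for r in group) / len(group)

print(share)                     # Counter({'Android': 3, 'iOS': 2})
print(return_rate("Android"))    # 2 of 3 Android users returned
```

Comparing `return_rate("Android")` and `return_rate("iOS")` over time is one direct way to answer the engagement questions posed above.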
Learn More About Passive Data Collection
Check out the replay of an educational webinar presented by Survey Analytics CEO Andrew Jeavons titled "How to Collect Passive Data from Smartphone Users." The webinar provides an overview of passive data collection and the variables (data points) that can be collected from Smartphones. Great use case scenarios in multiple industries are covered as well as the concerns about security and privacy from this data being captured. Passive data can be collected through the SurveySwipe application, through a white-labeled SurveySwipe app or within your own existing application through API/SDK. Implementation methods, reporting and the business decisions that can be concluded from this data are also covered in the webinar.
Thursday, December 5, 2013
Survey Analytics VP of Client Services Esther LaVielle took a moment yesterday afternoon to answer questions we did not have the opportunity to get to. Thanks again to everyone that joined us! If you missed out on this presentation, you can access the replay here.
Q1: What all checks are available in Data Validation?
Esther: In terms of data validation for both Conjoint and MaxDiff: after a study has concluded, you will be able to access both an online report and all of the raw data. You will be able to see what was displayed and what was picked, and draw your own conclusions if needed.
If you are asking about validation requirements prior to deploying a survey, you can access our survey overview page, which highlights any errors in logic and skip patterns before deployment. The best thing I can recommend is to run a test among your team - and colleagues who are not familiar with conjoint - to solicit feedback on the user experience; this gives you time to clearly define any features or attributes you are using for either question type. All test data can be deleted prior to deployment and will not influence your final results.
Q2: Which is more beneficial - Conjoint or Maxdiff?
Esther: My personal opinion is that all questions, even conjoint and MaxDiff, are beneficial so long as people can clearly understand and answer them promptly. Each question type is unique in its data output, so it depends on the end goal of your project. Are you looking to figure out what new products or services to sell in 2014? Then choice-based conjoint would be a good fit. MaxDiff, on the other hand, can be applied to many areas of research, including brand preference, customer satisfaction, product or service features, and even message or image testing. The good thing about MaxDiff is that the list of items you want to present to the respondent does not.
Q3: Can we create conjoint in Survey Analytics and call it in mail survey in confirmit?
Esther: We offer survey URL links, and where you post them is at your own discretion; however, we do not integrate with competitor software. You should be able to run a comprehensive survey directly in our platform if required.
Q4: Hello - I missed the beginning of this, I'm sorry. Will this be more about the application of Conjoint and MaxDiff, or about how to set them up technically speaking?
Esther: Yes, towards the end of the presentation recording at about 40 min in, you will see a demonstration of how to set up both question types in our system. If you would like to run an evaluation on our site, please feel free to sign up for an evaluation license here: https://surveyanalytics.com/a/showEntry.do?mode=surveyanalytics.
Q5: Output from the simulators is preference share, not truly market share. Does the model make assumptions (or does user have a choice) to get from preference share to a truer estimate of actual market share?
Esther: We use the term market share as an estimator based on the data that has been collected. In this case, the term preference share is interchangeable, as you are simulating showing particular profiles to respondents and deriving the output from what they answered. If you are asking about the true definition of market share, an estimation tool based on the data received will not achieve this - definite, spot-on accuracy of market share is not possible. If anything, it will give you an idea of the potential within the market space for the products/services in consideration.
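For readers curious what a simulator's "share" actually computes: a standard approach (assumed here for illustration; the exact model Survey Analytics uses may differ) is to sum each respondent's part-worth utilities for every product profile and convert the totals to shares with a logit rule. The utilities and profiles below are hypothetical.

```python
import math

# Hypothetical part-worth utilities for one respondent, keyed by feature level.
utilities = {
    ("brand", "A"): 0.8, ("brand", "B"): 0.2,
    ("price", "$10"): 0.5, ("price", "$15"): -0.3,
}

# Two competing product profiles defined by their feature levels.
profiles = {
    "Product 1": [("brand", "A"), ("price", "$15")],  # total utility 0.5
    "Product 2": [("brand", "B"), ("price", "$10")],  # total utility 0.7
}

def preference_shares(profiles, utilities):
    """Share-of-preference (logit) rule: share_i = exp(U_i) / sum_j exp(U_j)."""
    totals = {name: sum(utilities[level] for level in levels)
              for name, levels in profiles.items()}
    denom = sum(math.exp(u) for u in totals.values())
    return {name: math.exp(u) / denom for name, u in totals.items()}

shares = preference_shares(profiles, utilities)
print(shares)  # Product 2's higher total utility gives it the larger share
```

The shares always sum to 1 across the simulated profiles - which is exactly why they describe preference among the profiles you chose to simulate, not true market share.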
Q6: When should you use MaxDiff v Discrete Choice?
Esther: You should use conjoint analysis if you are looking to gather information on the potential market share of new or existing products/services you intend to make money on. It's often used as part of a pricing research strategy, as well as to test new ideas before deciding whether investment and resources should be poured into them. MaxDiff can be used as part of many different types of research to determine preferences among your attribute list - customer satisfaction studies, brand awareness, product or service features, or message/image testing. Anytime you are faced with a big rating scale, ask yourself whether a MaxDiff question type would give you insights you might otherwise miss with standard questions.
Q7: What is the pricing model.. Is it an annual license?
Esther: We offer monthly and annual licenses here at Survey Analytics. Please feel free to reach out to our sales team to get started with an evaluation license or discuss pricing. Sales: firstname.lastname@example.org | Phone: 1-800-326-5570
Q8: Does your platform integrate with Confirmit and Qualtrics?
Esther: We do not integrate with competitor software. You can easily create and deploy a comprehensive survey to be deployed across all channels within Survey Analytics. We have a free trial available if you would like to further explore the tool and what it offers: https://surveyanalytics.com/a/showEntry.do?mode=surveyanalytics.
Q9: How do you define "homogenous" for d-optimize?
Esther: We use the Merriam-Webster definition. http://www.merriam-webster.com/dictionary/homogeneous
Q10: Can you do a market share simulation with MaxDiff?
Esther: For conjoint, yes we can, but for MaxDiff that is not typically done. However, it's a great idea and something I can raise with my head developer and operations manager. If you would like to send us more information on why you think this would be valuable to add, please contact me directly at email@example.com. It would be great to speak with you about it.
Q11: What is your experience in terms of the % of people who are invited that will actually participate?
Esther: Well, in any research project we know that about 50% of your budget is allocated to recruitment: finding sample, accessing and managing a database of willing participants, and incentives. The better you are at managing respondents who fully understand what is asked of them, and the clearer and more concise the survey is, the higher the response rates. Other factors - incentives, timing of survey deployment, survey topic, follow-up reminders, etc. - will affect your response rates as well.
If you are interested in conjoint and maxdiff in Survey Analytics specifically I will give you a few tips that can help:
- Make sure your features and levels are clear and easy to understand and in the language of your target respondents. For maxdiff, make sure the attributes are easy to understand and if you need, limit the number shown per page.
- Set up definitions, photos, instructions, and tips within the conjoint or maxdiff or prior to getting to the conjoint so they know what is expected of them.
- Too many conjoint exercises will result in respondent fatigue and skew your end results, so keep it as concise as possible. If you are using a D-optimal design and it calls for a larger task count, be clear with respondents about how many questions you must run by them, and let them know the incentive is directly tied to their responses to these questions.
- Target consumers who WANT to take this kind of survey. Pre-screen if necessary.
- Use all follow up tools to remind those who have not completed to take the survey. Survey Analytics offers a reminder scheduler that you can set up ahead of time with a message targeting those who need to complete the survey.