Tuesday, July 1, 2014

New App Idea: "Running Early" (#15)

People are often late.  Even very bright individuals have a voice in their head that says, "we'll be fine, we can wait…" and so don't leave enough time to get where they need to be.  Some employ tricks, like buying Clocky, the MIT-made alarm clock that runs across the room.  Others set the clocks in their cars forward to fool themselves into arriving places early.

But after a while, we know our clocks are two or four minutes fast, and the benefit is gone.  How can we reignite the urgency many people need to be on time?

I will make Running Early (working title).  It will periodically and randomly adjust the time on your mobile devices (and ideally, your smart car) so that they are between 0 and 10 minutes fast.  My bet is that you can mentally subtract away a clock you know is exactly 5 minutes fast, but you can't if it could be 2, or 5, or 8 minutes ahead, so you'll leave earlier to compensate.  Running Early will sync across devices but stay out of the way, subtly encouraging you to get out the door and to your meeting a few minutes early.
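The core mechanic is easy to sketch.  Here's a minimal Python sketch of the randomization logic; note that `displayed_time` stands in for a hypothetical hook into the device clock, which real mobile OSes don't expose to ordinary apps (part of why OEM licensing matters for this idea):

```python
import datetime
import random

def random_offset(max_minutes=10):
    """Pick a fresh random offset between 0 and max_minutes minutes fast."""
    return datetime.timedelta(seconds=random.uniform(0, max_minutes * 60))

def displayed_time(true_time, offset):
    """The time the device would show: always a little ahead, never behind."""
    return true_time + offset

# Re-randomize periodically (say, nightly) so the user can never "learn"
# the offset and mentally subtract it away.
offset = random_offset()
true_now = datetime.datetime(2014, 7, 1, 9, 0, 0)
shown_now = displayed_time(true_now, offset)
```

The key design choice is `random.uniform` over a fixed setting: a constant offset is exactly the Clocky-in-the-car trick that stops working once you know it.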

In a perfect world, we wouldn't need Running Early.  We would have plenty of responsibility and self-control.  But according to just about everything written in behavioral economics, we don't.  People like being a few minutes early, able to walk instead of run before an appointment, to sit and check their email or just enjoy the day.  And yet they rarely give themselves that chance.  Running Early removes that cognitive dissonance from your life, giving you a handful of minutes you didn't know you had, and resulting in a happier and more prompt you.

The app's first screen would have a big green button that activates the time randomization, with no sign-in required.  On Android I would add a 1x1 widget, similar to the "Data ON-OFF" app.  You could always drag up from the bottom to peek at the correct time (tucked away like this because knowing the true time ruins the power of Running Early, so users should look at it quickly and rarely).

Monetization strategy #1 is to patent the time randomization and license or sell it to OEMs (if you have a friend or relative in intellectual property, please send me their contact info!).  If I can't do this, monetization strategy #2 is to expand the feature set so that Running Early either builds a brand and user base, or offers enough that people will pay for a premium version.  Additional features could include a check-in button in the Android notification tray, which, synced with the user's calendar app, would generate a report showing how early they were, how their on-time rate varied by randomization settings, weather, and other items data geeks might enjoy.  Foursquare integration would let users log their check-ins automatically.
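The earliness report is straightforward to sketch.  Assuming hypothetical (meeting start, check-in) timestamp pairs pulled from the calendar and check-in data, a minimal version might look like:

```python
import datetime

def earliness_report(events):
    """Given (meeting_start, check_in) pairs, report average minutes early
    and the on-time rate.  Negative earliness means the user was late."""
    minutes_early = [
        (start - check_in).total_seconds() / 60 for start, check_in in events
    ]
    on_time_rate = sum(m >= 0 for m in minutes_early) / len(minutes_early)
    return {
        "avg_minutes_early": sum(minutes_early) / len(minutes_early),
        "on_time_rate": on_time_rate,
    }

t = datetime.datetime
events = [
    (t(2014, 7, 1, 10, 0), t(2014, 7, 1, 9, 56)),   # 4 minutes early
    (t(2014, 7, 1, 14, 0), t(2014, 7, 1, 14, 2)),   # 2 minutes late
]
report = earliness_report(events)
```

Breaking these numbers out by randomization setting would give the A/B comparison data geeks want.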

A patent would be ideal.  But if I had to build a brand, I would connect Running Early to blogs about procrastination, behavioral economics, building good habits, etc.  The app would be linked to a lifestyle, a choice that says, "I'm not perfect, but I'm resourceful and smart enough to work on my lateness".

What do you think?  Would this app add value / improve punctuality?  Would people beyond me and those I've spoken to actually download this app?  Thank you for your suggestions!

Thursday, April 10, 2014

How to Drive Clicks and Grow Your Business…Just Kidding. (#14)

An Ordinary Guy’s Lessons Learned in Boosting Web Presence.

Full disclosure: I've been running experiments on you.  Don't worry, it's nothing sinister.  I've been shortening the links I put on Facebook and Twitter, because then I can track how many times each one has been clicked, and when.  It's all anonymous, but interesting nonetheless.

As a guy beginning a career in data analysis, I believe that virtually all actions we take can be quantified, especially online.  Any company that is not collecting data, and learning from or finding relationships in it, is missing ways to improve at what it does.  So I track, and so can you!
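Shorteners like bit.ly expose click counts and timestamps per link, and even casual analysis starts with a simple tally.  Here's a small sketch (the timestamps are made up for illustration) of bucketing clicks by hour of day to see when an audience is actually awake:

```python
import datetime
from collections import Counter

def clicks_by_hour(timestamps):
    """Tally click timestamps by hour of day to find when readers are active."""
    return Counter(ts.hour for ts in timestamps)

clicks = [
    datetime.datetime(2014, 4, 10, 11, 5),
    datetime.datetime(2014, 4, 10, 11, 42),
    datetime.datetime(2014, 4, 10, 21, 3),
]
histogram = clicks_by_hour(clicks)
```

The same one-liner pattern works for day of week, platform, or topic once the data is exported.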

When we're being honest with ourselves, all behavior on social media is at least a bit narcissistic and self-centered.  If you didn't care at all who saw and engaged with your updates, you would write a diary instead.  I try not to get wrapped up in like-baiting; after all, social media is fun and shouldn't be taken too seriously.  Yet it would be good to have a grasp on how people behave online if I ever go into marketing.  In my job I will learn how to be statistically rigorous and all that jazz, but for now I'd just like to share my anecdotal findings about how/when/where I post.  Here they are:

1. Conciseness works.  I used to give a full opinion on the articles I tweet, barely staying under the character limit.  But I've seen time and time again that less is more: if I just say "this is the future" or "a strong argument", I'm more likely to be noticed in the feed.

2. Headlines matter.  This may be obvious, but I've confirmed for myself that a call to action, with the important words up front, leads to more engagement.  There will be consequences if your content continually doesn't live up to its headline, but at a basic level you need something that drives another person to care.

3. Facebook accounts for at least 80% of my social reach and network.  My peers spend much, much more time there than on Twitter, Google+, Tumblr, LinkedIn, or anywhere else.  An average article I post on Twitter gets 1-3 clicks, but on Facebook even silly stuff gets more than this.  Of course, I'm not a Twitter or blogging star, and I imagine this pattern would reverse if I made a greater effort to attract people I don't know on those platforms.

4. A retweet, like, comment or +1 soon after posting significantly increases reach.  Facebook, Twitter and Google's algorithms may be complex, but clearly engagement increases the number of times your stuff pops up on a screen.  Not only that, social proof takes hold: people want to be in on what others around them know about and have verified is good content.

Today (this was the basis for my post) I linked an article in which people mined eHarmony data and found that people look to date others quite like themselves, even if they say the opposite.  Pritika, a friend with a near-superhuman network of friends (Hi Pritz), commented one minute after I posted, saying "This is so interesting. thanks for sharing J.Lew".  It was the perfect comment to get others interested.  In fact, if I did social media for more than fun, I would arrange with partners to comment on each other's content, because it works.  In a couple of hours I had over 70 clicks.  Another factor is the topic, which fits what college students like: readable but also potentially informative stories.  Ultimately, the goal is not to get clicks for their own sake but to tailor my stuff so that my friends enjoy my web presence.  No one likes the guy posting about Ayn Rand three times a day.

5. Don't post at 2 in the morning.  Even those who rarely use social media would agree that yelling to someone is better than yelling to no one.  Wait until 11am for Facebook, and 1pm for Twitter.  I started posting my #MusicMonday updates (check out my blog post on this! Or not) at around 2pm, as the internet tells me that is the peak of web traffic.  In time, I've learned my demographic is at least as active around 8 or 9pm.

6. I don't like to engage in Facebook political wars anymore, because I'm not 16.  But every once in a while, I ask people what they think about the news.  As a rule, if you want comments (or retweets, or whatever), you should ask for them.  As serendipity would have it, a post about the ACA earlier this year began a long discussion, bringing in the views of most of my different friend groups.  I returned to the computer that evening to shares and an absurd number of clicks, probably my highest ever.  This is social media functioning well: content that inspires people to stop scrolling, to think and respond.  Or maybe I got lucky.

Let me know your thoughts.  Is casual analysis of your social media a waste of time?  Is there an easy way to do this more scientifically?  Is it worth a few seconds of extra effort to post better stuff on average?

Tuesday, December 31, 2013

The Case of the Missing Battery Life (#13)

There are a few important things in life that drive us, but that doesn't belittle the many small things that irritate the hell out of us.  As we shift into a connected, device-centric world, battery life has not caught up to our usage, and constantly glancing up at the 8% in the corner of our smartphones and laptops has become an ever-present source of stress that really shouldn't have to be there.
My first smartphone is a Samsung Galaxy S3, a popular phone and a capable one.  However…the battery life is horrendous.  While society has reluctantly assented to a world where connected devices need to be charged every day (it won't always be this way), mine couldn't make it through a day to save its life.  Sometimes I'll watch a YouTube clip and watch 20 or 30% ooze out of my battery, or leave it and return a little while later to 2% left with no idea why.  What's worse, an entire night of charging (which for me is five or six hours) doesn't get me close to 100%, and I've tested at least four charging cables to confirm the problem is my phone.
Yet I won't resign myself to a frustrating second year with my first smartphone, and have instead committed to solving The Case of the Missing Battery Life.  Hopefully this will be helpful to other Android users as well.
My first task was to identify the largest sources of power drain, and I found what most people discover in their own quest for greater longevity: Wi-Fi, 4G, vibration, and the screen (especially on big ol' Samsung displays).  So now, Wi-Fi automatically turns off when I leave home or school, and 4G I toggle manually for when I need it.  My phone never vibrates anymore, the brightness level is "squint", and the phone goes idle pretty quickly.  I've been careful to do this as scientifically as possible, watching the effect of one variable at a time, and the screen tactics have been the most successful.  But there's more to be done.
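The one-variable-at-a-time approach boils down to comparing drain rates.  Here's a sketch with made-up readings (the numbers are illustrative, not my actual logs) of how you might compute percent-per-hour savings for each setting against a baseline:

```python
def drain_rate(readings):
    """Percent battery lost per hour, given (hours_elapsed, percent) readings."""
    (t0, p0), (t1, p1) = readings[0], readings[-1]
    return (p0 - p1) / (t1 - t0)

# Hypothetical logs: battery percent at elapsed hours, changing one setting
# at a time against an everything-on baseline.
logs = {
    "baseline":       [(0, 100), (6, 40)],
    "brightness_low": [(0, 100), (6, 64)],
    "wifi_off":       [(0, 100), (6, 52)],
}
savings = {
    name: drain_rate(logs["baseline"]) - drain_rate(readings)
    for name, readings in logs.items() if name != "baseline"
}
```

In this made-up data the screen wins, saving 4%/hour versus 2%/hour for Wi-Fi, which matches my anecdotal experience.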
You know how Android can multitask extremely well (take that, iOS users)?  Well, that has a cost, and it's sweet, sweet power.  Not only do I clear the recent apps screen, but I use a task manager to force-quit the apps that really don't deserve the CPU (games and seldom-used apps that constantly auto-download updates in the background).  This has gone a long way.  At the end of the night two days ago, I still had 14% (I live in the red zone).  Android gives you a lot of tools to diagnose its dismal battery conservation, and I was in for a surprise when I checked the battery usage breakdown screen.  Highest, with 22%, is Google+?!?  I immediately entered the Google+ app (there's a first time for everything) and unchecked everything I could.
Working on a project downstairs and outside, and happy to be disconnected, I actually didn't use my phone yesterday.  When I returned, I expected my problems to finally be over; I'd systematically dealt with every pesky obstacle to good battery life.  20% lost idle over 6 hours?  I can deal with that.  But strangely, Google Search is now taking 37% when idle (I haven't searched anything!), and other vague "idle" processes (which I can't mess with) total two-thirds of my usage.  What gives?
My current hypothesis is that mother Google is checking in constantly with my phone, collecting data on emails and updates, even when I'm not doing anything.  Even when I disable background apps and updates and its minions Google+ and Search, the data-grabbing I've blithely ignored has been draining me.
I know Android exists to collect data, but I want to know if others have noticed Google scroogling with their device experience.  Comments appreciated!

Tuesday, November 5, 2013

Genres, Words, and Communicating Musical Taste (#12)

I started posting #MusicMondays in the hopes of engaging some people out there to discuss, listen to, and share music with me.  Music is better together.

Here are my recommendations, if you're interested.  I'd highly appreciate thoughts:
1. Jose James and Emily King - Heaven on the Ground (http://bit.ly/1gjSsjR)
2. Snarky Puppy - Too Hot to Last (http://bit.ly/19JnACa)
3. Gretchen Parlato - How We Love (http://bit.ly/186isbH)
4. Kimbra - Settle Down (http://bit.ly/19NJgAy)
5. Robert Glasper ft. Norah Jones - Let it Ride (http://bit.ly/1djlphh)
6. Gotye - In Your Light (http://bit.ly/1czY9eo)
7. Lettuce - Break Out (http://bit.ly/1aNuFCm)
8. Q-Tip - Johnny is Dead (http://bit.ly/1e6m7uf)
9. The Bad Plus - Everybody Wants to Rule the World (http://bit.ly/1gygW7W)
10. Stan Getz - Wave (http://bit.ly/1fBXLtY)
11. Gretchen Parlato - Holding Back the Years (http://bit.ly/1dxtaA4)
12. Childish Gambino - Telegraph Avenue (http://bit.ly/1ciNHRK)
13. Frank Ocean - Sweet Life (http://bit.ly/1985oJc)
14. James Fauntleroy - Fertilizer (http://bit.ly/1iKjvVA)
15. Ratatat - Cherry (http://bit.ly/1eNVH0l)
16. Snarky Puppy - Something (http://bit.ly/SnarkyPuppy1)
17. Nujabes - Feather (http://bit.ly/1esKoNm)
18. Hard Jazz - Greg Spero (http://bit.ly/gregspero)
19. Pat Metheny - Medley (http://bit.ly/1gcheAi)
20. Robert Glasper - Butterfly (http://bit.ly/1duP4lU)
21. Joshua Redman - Let it Be (http://bit.ly/RedmanLetitBe)
22. Chance the Rapper - Cocoa Butter Kisses (http://bit.ly/ChanceCocoaButter)
23. J Dilla - Lightworks (http://bit.ly/1iU2Yzi)
24. Snarky Puppy - What About Me? (http://bit.ly/1lGxiyd)
25. Bill Evans - My Bells (http://bit.ly/BillEvansSymphony)
26. Kanye West - Champion (http://bit.ly/KanyeChampion)
27. Russ Kaplan - Gouge (http://bit.ly/1rG8skR)
28. Nujabes - Latitude (http://bit.ly/1pbNkEg)
29. Gabe Dixon - Strike
30. Jose James - Without U (http://bit.ly/JosejamesWithoutu)
31. Funky Knuckles - Shields of Faith (http://bit.ly/1nYkvpJ)
32. Isaac Hayes - Shaft (http://bit.ly/1of6xQv)
33. Soweto Kinch - Good Nyooz (http://bit.ly/1p8SZch)

But this leads to a topic for reflection: I've been having more trouble as of late communicating with others about music.  I believe this is mostly on me, because I keep changing while, in aggregate, nothing has changed in the college music scene in the past year or two, except maybe a leveling off of our dubstep fixation.

Background for those who haven't known me for a long time: I'm a jazz guy by trade.  That was what I learned when I started the piano at 13, what I went to programs and played in regional bands for in high school, and the bulk of my listening, writing, and musical discussion through age 19.  I loved it.

But I don't define myself by it anymore.  Growing up, we're placed on pre-defined musical "tracks", the main ones being classical, jazz and rock.  With few exceptions, private teachers will focus on one of these three, schools will offer classes and ensembles in just the first two, radio stations will label themselves as one of the three (or pop, but that genre's lowest-common-denominator nature is definitely a topic for another day), and communities formed under one of these three umbrellas will have a much larger following.  Initial interest put me on the jazz track, but I could have been happy if there were a strongly-supported "funk", "soul", or "jam" track instead.

One of the most important things I learned in college was to be more open-minded (and open-eared) and to seek diversity of knowledge.  But while genres serve a clear purpose, providing catchall terms and pointing listeners in a general direction, they put up arbitrary walls.  Just as with, for example, two-party political systems or college majors, those between labels are unintentionally marginalized and encouraged to conform.  We lessen the problem by creating more numerous, more amorphous labels like "ska" or "adult contemporary", but none of this makes it easy when I am asked the classic ice-breaker, "what kind of music do you like?".

This gets at the big question: how do we get our music preferences across (to both the musically passionate and dispassionate) without pigeonholing ourselves into genres, using cliché descriptors like "acoustic", "hard", or "funky", or coming off as totally snobby?

First, I recognize it's not crucial that I be perfectly understood any time someone asks me about music.  But for the segment of the population that loves music and talking about it, I'd love to be able to explain more precisely than:

"It's kinda like jazz and funk and soul and modern, usually dense harmony, with heavy or occasional improvisation, riffs, smooth voice leading..."

And remember, I'm trying not to come off as haughty or trying to prove something.  That's the hard part.  But 2013 labels simply don't do Jose James or Gotye or Lettuce justice.  Here are the common iTunes designations for my recently played music of the past 18 months:

Jazz - Funk - Modern Jazz - Alternative - Hiphop - Jazz - Blues - Classic Rock - Soul  - Jazz/Funk

This vocabulary is limited and not particularly helpful.  

Here's my recent approach to reaching a "meeting of the ears" with friends: wait and listen to three artists they like, then play the midpoint between your musical comfort zone and theirs.  For example, Katie played rap, but seemed to like both substantive/"real" lyrics and intense harmonic grooves, so I played Robert Glasper (who's totally awesome).  Adam came from a traditional jazz track, but also has a soft spot for female pop singers and for updating/messing with the trad jazz formula, so I passed along Gretchen Parlato.  My roommate JFran is an interesting case.  He's one of the only people who listens to substantially more music than me, but he Matrix-dodges my ability to label it.  He likes the rawness of local bands, electronic layers and integration, bands that sound like the Beatles shoved into the 21st century, ambiance, acoustic guitar he can strum along to, and more (here's one example).  I went out on a limb with the cool folk group Bad Books, and we're slowly reaching some musical overlap.
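For fun, the "midpoint" heuristic can even be written down literally.  This sketch uses completely made-up feature axes and scores (harmonic density, lyrical focus, electronic-ness, each 0-1); the point is the shape of the idea, not the numbers:

```python
def midpoint(a, b):
    """Average two taste vectors component-wise."""
    return tuple((x + y) / 2 for x, y in zip(a, b))

def nearest(target, catalog):
    """Pick the artist whose (hypothetical) feature vector is closest to target."""
    return min(
        catalog,
        key=lambda name: sum((x - y) ** 2 for x, y in zip(catalog[name], target)),
    )

# Made-up axes: (harmonic density, lyrical focus, electronic-ness).
my_taste    = (0.9, 0.3, 0.2)
katie_taste = (0.4, 0.9, 0.5)
catalog = {
    "Robert Glasper":   (0.7, 0.6, 0.3),
    "Gretchen Parlato": (0.8, 0.4, 0.1),
    "Ratatat":          (0.3, 0.0, 0.9),
}
pick = nearest(midpoint(my_taste, katie_taste), catalog)
```

With these invented numbers the midpoint lands closest to Robert Glasper, which is exactly the Katie anecdote above.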

So, my (reasonably practical) recommendation to everyone else who enjoys music: have three artists at the top of your mind that represent you, and have a shortcut to play them on your phone.  If everyone has a music player with them at all times, we might as well stop confusing each other with words when sounds are worth a million of them.

Monday, October 21, 2013

Faith in Microsoft Restored - My Impression of Windows 8.1 (#11)

(If you missed my earlier thoughts about Windows 8, you can find them here.  There is a noticeably different tone on that post vs. this one.)

It has been exactly a month since I joined Windows 8 World.  I know this because my Office free trial ended, destroying my productivity for the day.  I've been getting good at integrating little new features while still generally ignoring Microsoft's Metro vision for how I should use my PC.  I got around to unclogging my Start screen, though I Windows+D straight to the desktop as a reflex anyway.  I dropped my files into SkyDrive and have found most of the capabilities of Office 2013 that I already knew from 2010.  I even started using the touch gestures; sometimes it's easier to use Skype or my PDF reader as an app if I can just swipe in and out.  I don't think the cursor's days are numbered, and my Metro Start screen is still ignored, but I've learned to at least clean house within my Microsoft-imposed hell.

And then Windows 8.1 happened, and my faith in Microsoft to listen to its customers was restored.  It's not like anything was that different.  The famous Start button returns, the snapping tool gets more flexible, the Start screen has new live tiles and sizing options, and the Windows Store is no longer a mess.  On the first day, I went about business as usual, rather underwhelmed.

But then I accidentally snapped a news app in, and realized I could resize it and move it around.  Now I can actually meld Metro world and desktop world, customizing my current screen and multitasking in a way I never could pre-Windows 8.  And then I decided, "If I'm going to actually use some programs as apps to take advantage of this cool view, I might as well organize my Start screen."  My search for good icons brought me to the Windows Store, which now prominently shows ratings and recommendations while still feeling more spacious than before.  I downloaded apps in a frenzy, treating my PC like a phone and fiddling with the layout (I'm pretty OCD when it comes to organizing technology) until I could conceivably spend most of my time in Metro mode.  Let's be honest: most 21-year-olds could survive on just social media, Chrome, Office, Amazon, Steam and Netflix for about 90% of their computer usage.  I noticed hundreds of apps that probably existed before 8.1, but that I always ignored due to that small initial hurdle of confusion.  There's a beautiful little tool called Piano Time; I was pressing pretty 6-note chords more easily than on any fake piano I've tried.  And all the apps were waiting with a "NEW" tag in my All Apps area, now reachable with a swipe downwards on the touchpad.  I felt that "power-at-your-fingertips" feeling, where everything seemed intuitive and did what I wanted it to do, unlike how I felt the entire first 29 days.

All it took to make me realize that Windows 8 wasn't awful was a little more ease of use.  Just a slight push to get me sledding down Metro Mountain.  My main thought now is: couldn't anyone have told Microsoft to make its tools more flexible?  Wouldn't serious consumer testing have shown these minuscule changes were worth making?  And if Windows 8 was so clearly inadequate and unable to win over consumers, why did it take a year to polish?  I won't mull over these questions, though, as you'll just get frustrated.

I could tell on first boot-up yesterday, when a tutorial spoke to me like an adult, but also didn't assume I'd been sitting in a Washington office with Steve Ballmer for a year, that Microsoft was listening.  But I'd like to see the company continue to update its UI at least every few months.  Google and Apple make near-constant fixes, and users expect no less.  With 8.1, Microsoft now has the software of 2013 to match its hardware of 2013; now it just needs the customer service and responsiveness of 2013.

What do you think about Windows 8/8.1?  I'd love to hear your comments below!

Tuesday, October 8, 2013

Netflix, eBooks, iTunes - The Gang's All Here (#10)

How the digitalization of content resized all of our media, and what will happen next.

Twenty years ago, TV shows were only 22 or 44 minutes.  Best-selling books were 150-300 pages, newspapers were the size you could comfortably read with two hands, music albums 30-60 minutes, and movies 90-150 minutes.  Why?  Money and form constraints, mainly.  Content of all forms followed size patterns so that it felt substantial and worth paying for, but could fit on a VHS tape, or a CD, or be bound.

Then the commercial internet arrived, and everything changed.  Well, not immediately.  Albums and movies and books were still usually the same length.  It was still easier for consumers to start TV shows at the top of the hour, and to invest in a book of around 200 pages (i.e., not An American Tragedy).  It was what we were used to.

But then we got creative.  As the web matured and became more complete, we found no reason to restrict ourselves based on form.  With virtual folders and 500 gigs of hard drive space, who cares whether an album is ten minutes or ninety?  Visual stories could be told all at once in a movie, but also five minutes at a time on YouTube, or just 5 seconds at a time.  I post to my blog ~500 words at a time: no binding or publishing costs required.  Netflix throws its content at you seasons at a time, and what we can't find there we download by the season.  Digitalization means 'one size fits all' has become 'all sizes fit me'.

This is not news, per se, if you lived through the late 20th and early 21st centuries.  But what I find most interesting is how entire industries have grown or gone away as a result of this paradigm shift.  Our binge-watching culture has given birth to a recapping industry; journalists share insight about a TV episode, and people who are at the same point find each other and discuss it in the comments.  TV themes are shorter and hookier; no one needs to hear the story of the Brady Bunch every time when they're watching 9 episodes in bed.  One of my favorite albums, Pat Metheny's The Way Up, is one 68-minute composition: no flipping of the record or tape or CD, or even of tracks, required.  News doesn't resemble its form from any earlier era.  There are 13 million archived New York Times articles but 17 billion Tumblr posts.  I consider Vines (videos of 6 seconds or shorter) an art form, and film majors are actually getting discovered through Vine microfilm competitions.

The possibilities are great; here's a good example of new technology changing art.  I'm working on a project for class where I am helping e-publish a woman's book.  Her non-fiction work about communist Russia spans many regions, fields, and decades, and she described it as difficult to find the right order in which to present the information.  So I thought, why do we need a linear story?  Why not make a web of connected stories, as they are connected in real history?  We could have a story about Lenin lead to stories about the 1920s, or to other revolutionary leaders, or to his faction, the Bolsheviks, depending on what the reader wants to see next.  We could even visualize the story in a way books couldn't, with hundreds of embedded photos or videos.  And it could all be in a Kindle or iOS app.  I'll fill you in more as the project progresses, but you get the idea.
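The web-of-stories idea is essentially a graph of content nodes, which is easy to sketch.  The node names and text below are placeholders I've invented for illustration, not material from the actual book:

```python
# A nonlinear "web of stories": each node holds content plus named links outward.
stories = {
    "lenin": {
        "text": "A story about Lenin...",
        "links": {"the 1920s": "1920s", "the Bolsheviks": "bolsheviks"},
    },
    "1920s": {"text": "Russia in the 1920s...", "links": {"back to Lenin": "lenin"}},
    "bolsheviks": {"text": "Lenin's faction...", "links": {}},
}

def read(node_id):
    """Render one story node and the reader's choices for where to go next."""
    node = stories[node_id]
    choices = ", ".join(node["links"]) or "(end of this thread)"
    return f"{node['text']}\nNext: {choices}"

page = read("lenin")
```

An e-reader or app would render `links` as tappable anchors, and photos or videos would just be more node content.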

I fully expect TV shows to abandon the idea of standard episode lengths in the next five years.  The Kindle market has already done this for books.  As CDs die out, expect more and more artists to ignore the constraints every generation of musicians before them felt as they tried to record and promote themselves.  Say goodbye to big-budget recording and music stores; say hello to GarageBand and SoundCloud.  Soon, no major human event, even in the developing world, will go unrecorded.  Expect a world where there is no "normal" media, few artistic standards, and no limits.  Are you excited?