Alright, Try To Break It
Date: September 27, 2016
“Alright, try to break it. Let me know if you can reproduce the problem.”

“Try to break it.” That is really different from “testing” an asset. My team and I have tested countless quizzes and custom assessments over the last year. When doing so, we’re looking for any kind of design or editorial mistakes, functionality errors, form malfunctions, etc.

Until this past week, testing was our way of making sure the asset worked before handing it over to the client.

I found out early in the week that one of our live quizzes was not working properly. The user end was fine: people would take it, get their results, and move on. The client end, however, was not. After blasting out an email to a rather large list, the client went in to look at their leads, and 90% of them were blank. Cue the alarm.

Photo of the backend

When I signed in to the client portal, I could see exactly what they were referring to. Clearly, people had been taking the quiz (upwards of 300, actually), and we could see which result category they fell into, but for most of them we had no lead-gen information.

The first thing I did was go back and test the quiz. I clicked through the whole thing, filled out the form, and hit submit to reach the results. And guess what, it worked. My information popped up in the portal just as it should have. At that point, I got on the phone with my developer and started to explain what I was seeing. His response: “Alright, try to break it. Let me know if you can reproduce the problem. I’ll do the same.”

That was my lightbulb moment.

I wasn’t exactly trying to break the quiz when I was testing it. I was taking it and making sure it worked correctly. The downfall there is that by the time we’re testing, we know the quiz inside out. I barely have to read the questions to know which are ‘select all that apply’ and which are ‘choose up to 3’. I already know which form fields are required and which result buckets lie behind it. I had never attempted to take the quiz incorrectly.

I started trying to reproduce the error. As I mentioned, though, I was too close to the asset, so I sent the link around to my coworkers on other teams and asked them to run through it. I must’ve asked about 15 people to take the quiz, and just a few of them popped up with blank submissions. Fast forward a little bit, and I had narrowed it down to the form. Something was going wrong when some users encountered the form. But what?

One girl kept taking the quiz and filing blank entries, so finally I asked her to send me a screenshot of the form before she hit submit. About two seconds after I saw the form, I knew what the problem was.

Photo of the form without the phone number filled out

She didn’t fill out the phone number field. If (or rather, when) the user left the phone number field blank, the form failed silently and didn’t collect any of the lead information.
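We never saw the quiz's source, but the symptom (a result delivered while the lead is silently dropped) matches a common pattern: one unguarded field access throws, the error is swallowed, and the submission proceeds without the lead. A minimal sketch of that failure mode, where `collectLead`, `handleSubmit`, and the field names are all hypothetical, not the quiz's actual code:

```javascript
// Hypothetical reconstruction of the bug -- illustrative names only.
function collectLead(form) {
  // form.phone is undefined when the field is left blank,
  // so .trim() throws a TypeError before anything is saved.
  return {
    name: form.name.trim(),
    email: form.email.trim(),
    phone: form.phone.trim(),
  };
}

function handleSubmit(form) {
  let lead = null;
  try {
    lead = collectLead(form);
  } catch (err) {
    // Swallowed error: the user still sees their result,
    // but the portal gets a blank entry.
  }
  return { result: "Result Category A", lead };
}

// A complete submission captures the lead...
const good = handleSubmit({ name: "Pat", email: "pat@example.com", phone: "555-0100" });
// ...but skipping the phone field silently loses the whole lead.
const bad = handleSubmit({ name: "Pat", email: "pat@example.com" });
```

Taking the quiz "correctly" only ever exercises the first call, which is why our normal testing never caught it.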

Lesson Learned.

There is a slim chance I would have figured that out on my own. I’m always testing the quizzes, taking them correctly and seeing what happens. What I learned last week was how important it is to test incorrectly. Try to break it, put in invalid answers and see how the piece responds. This was an extremely valuable lesson for my team, especially as we continue to grow and develop in this area of work.

Moving forward, when we are testing these kinds of assets, we will actually try to break them. We’ll continue to test them correctly, but in addition to that, we’ll take the quiz incorrectly to see how it responds. Human error is something we need to account for.
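In code terms, "trying to break it" amounts to feeding the submit path deliberately bad input and checking that no case loses the lead while still showing a result. A sketch of such a negative-test pass, assuming a `handleSubmit(form)`-style function like the one behind our quiz (all names and cases here are hypothetical; the fragile stub stands in for the real handler):

```javascript
// Hypothetical "try to break it" pass -- illustrative names and cases.
const breakItCases = [
  { label: "missing phone", form: { name: "A", email: "a@example.com" } },
  { label: "empty strings", form: { name: "", email: "", phone: "" } },
  { label: "missing everything", form: {} },
  { label: "unexpected extra field", form: { name: "A", email: "a@example.com", phone: "1", extra: "?" } },
];

function tryToBreak(handleSubmit) {
  const failures = [];
  for (const { label, form } of breakItCases) {
    let outcome;
    try {
      outcome = handleSubmit(form);
    } catch (err) {
      failures.push(`${label}: threw ${err.name}`);
      continue;
    }
    // A result shown alongside a blank lead is exactly the
    // silent failure that hit our live quiz.
    if (outcome && outcome.result && !outcome.lead) {
      failures.push(`${label}: result shown but lead lost`);
    }
  }
  return failures;
}

// A deliberately fragile stand-in handler, for demonstration:
function fragileSubmit(form) {
  return { result: "ok", lead: { phone: form.phone.trim() } };
}

const failures = tryToBreak(fragileSubmit);
```

The point isn't this particular list of cases; it's that the bad-input paths get exercised every time, not just the happy path we already know by heart.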

While this was definitely an issue, it was also a fantastic learning experience. We were able to identify the problem and make the necessary adjustments to get the quiz back up and running.

Bonus Takeaway.

We learned something else too. When I originally looked at the back end, I could see that roughly 300 users had taken the quiz. Scattered throughout the blank entries were some completed ones. Because I now knew what the issue was, I could conclude that most people did not want to include their phone number just to get their result.

Originally, phone number was something the client wanted to incorporate. After experiencing the malfunction and seeing how many users were unwilling to add it, we suggested they remove it from the quiz altogether. Another great learning experience for our client!
