I am curious how some of you adopt a data-driven approach in your startups.
Recently, I ran an experiment with a simple A/B testing tool, A/Bingo:
http://www.bingocardcreator.com/abingo/ Through that, I found that the majority of our traffic is search engine spiders.
Example: instead of 100 visitors with a 10% conversion rate (10 users), it becomes 10,000 visitors at a 0.1% rate (still 10 users); 9,900 of those "visitors" are spiders.
How do you handle that? Any tools you would recommend?
It would be great if you have references to blog posts or articles that explain in detail how to adopt a data-driven approach in a startup.
PS. I have been reading heaps about Dave McClure's AARRR, Eric Ries, Andrew Chen, Mixpanel, etc. I am looking for more specific examples with real-world scenarios.
Thank you.
I think robots.txt works well for "genuine" bots, and misbehaving bots can be handled with .htaccess rules.
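A minimal sketch of both (the disallowed path and the "EvilBot" user agent are placeholders, not specific recommendations):

    # robots.txt -- well-behaved crawlers honor this voluntarily
    User-agent: *
    Disallow: /experiments/

    # .htaccess (Apache mod_rewrite) -- refuse requests whose
    # User-Agent matches a known-bad bot
    RewriteEngine On
    RewriteCond %{HTTP_USER_AGENT} EvilBot [NC]
    RewriteRule .* - [F,L]

Keep in mind robots.txt is purely advisory: it only keeps spiders out of your test pages if they choose to obey it, which is why the .htaccess rule is the fallback for the evil ones.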
See robots.txt examples from some top sites.
Use a web analytics tool that's JavaScript-based -- search bots generally don't execute JavaScript, so they won't trigger it.
Beyond that, you'll have to discard all results from search engines. Maybe Patrick can help make that change to A/Bingo.
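In the meantime you can approximate that yourself by checking the User-Agent before counting a participant. A rough sketch in Python (the pattern list and the record_participation helper are hypothetical illustrations, not A/Bingo's actual API):

    import re

    # Substrings commonly seen in crawler user agents. Illustrative,
    # not exhaustive; maintained lists exist elsewhere.
    BOT_PATTERN = re.compile(
        r"googlebot|slurp|baiduspider|yandex|crawl|spider|bot",
        re.IGNORECASE,
    )

    def is_probable_bot(user_agent):
        # Treat a missing User-Agent as suspicious too.
        return not user_agent or bool(BOT_PATTERN.search(user_agent))

    def record_participation(user_agent, experiment, alternative):
        # Skip anything that looks like a spider so it never
        # inflates the participant counts for an experiment.
        if is_probable_bot(user_agent):
            return
        # ... increment the (experiment, alternative) counter here ...
        print("counted one participant for", experiment, alternative)

Filtering by user agent only catches bots that announce themselves honestly; the JavaScript-based tracking mentioned above is the more reliable backstop.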
P.S. +1 on the robots.txt
Here's what I found recently about adopting a data-driven approach; it helps articulate the concept a bit: