I was going to make an animated gif of the IntroCave watermark exploding, but searching for stock images is waaaaaay easier.
The Great Watermark Experiment of 2019
UPDATE: this round of testing didn't perform all that well, but I'll keep tinkering to figure out a sustainable way to bring free videos back.
I just turned off the watermarks on all of IntroCave's render servers. It took me about 2 minutes.
From looking at the search console and Google Analytics, I know a lot of people are looking for an intro maker with no watermark. I've had a task on my TODO list for over a year to run this as an experiment. It would probably have taken a day or two of backend code to set it up so new users get cookied into either a "watermark" or "no watermark" test group, at which point I could track the data through the preview-to-purchase funnel and see if having watermarks increases conversion rates. It's a totally fine experiment to run... but I'm just not sure how many people buy HD intro videos just to get rid of the watermark.
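The bucketing step I keep putting off could be sketched in a few lines. This is just a hypothetical illustration, not IntroCave's actual backend: the group names and the 50/50 split are my assumptions, and the real work would be in wiring the group into the render pipeline and the analytics funnel.

```python
# Hypothetical sketch of cookie-based A/B bucketing (not IntroCave's real code).
# Hashing the cookied user id means the same visitor always lands in the
# same group across sessions, with roughly a 50/50 split overall.
import hashlib

def assign_group(user_id: str) -> str:
    """Deterministically assign a visitor to a test group."""
    digest = hashlib.sha256(user_id.encode("utf-8")).hexdigest()
    return "watermark" if int(digest, 16) % 2 == 0 else "no_watermark"
```

The deterministic hash is the important part: you can recompute a visitor's group anywhere (render server, checkout page, analytics) without storing extra state beyond the cookie itself.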
The preview videos at IntroCave are extremely low quality. They're rendered at 15fps and 480x270 resolution (480p would be 720x480, which is about 3x more pixels and is not even HD). There's a reason for this: SPEED. Render time has a direct impact on the conversion rate: the longer people have to wait to see a preview video, the less likely they are to purchase one.
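The "about 3x more pixels" figure checks out if you run the arithmetic (using 720x480 for 480p, as above):

```python
# Pixel-count comparison: preview resolution vs. 480p as cited above.
preview = 480 * 270   # IntroCave preview: 129,600 pixels
sd_480p = 720 * 480   # 480p (DVD-style):  345,600 pixels
ratio = sd_480p / preview
print(f"{ratio:.2f}x")  # prints 2.67x, i.e. "about 3x"
```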
When I think about my target customers, it's people who want to grow their channel. Whether it's a hobby or a business, they view their videos as a serious endeavor and want to increase their production values. If someone is willing to slap a 480x270 intro on their videos... they're probably not customers anyway. That's okay!
I do expect some people will download and use the free previews without the watermark. It might be naive on my part, but I also don't think this is going to have much impact at all on how many videos I sell.
If I'm wrong, it'll take about two minutes to re-enable the watermarks. That's a heck of a lot better than spending a few days setting up an A/B test!
I've been slicing and dicing the data a bunch of different ways, and it seems to say that having only a single price ($10) performs worse than having two price options ($5/$10), by somewhere in the neighborhood of 10-20%. Because I changed the pricing model AND the checkout page at the same time, I don't have a way to tell how much of the drop in conversions is attributable to one versus the other (WHOOPS). I do think the change is better for the product, so I'm not going to rush out and revert all the code. I'll keep an eye on it (as I always do anyway) and think about tweaks I could make to bring the conversion rate back up.