You don't think it's just a cheap oscillator? I guess it would have to be pretty freaking crappy to lose minutes over a few days. I was in charge of the spacecraft clock on the NASA MESSENGER mission. We had this thing called the OCXO - oven controlled crystal oscillator. It was our "precise" oscillator and normally drove the clock. We also had backup ones that were "coarse", and those drifted a few seconds over a few weeks at worst. The OCXO was amazing. It drifted a few billionths of a sec per day, if I remember right. It was primarily needed to label laser altimeter data taken of Mercury's surface. It was so precise, it could measure the relativistic time dilation caused by changes in velocity and gravity! It literally enabled us to see time slow down and speed up. That was pretty damn cool. The OCXO engineer & I accidentally discovered it had that capability. We thought that we were seeing some systematic error, and it turned out we were seeing relativity!
Heh, yeah... I have a bit of experience with oscillators.

I build and use lots of hardware with Rubidium and OCXO oscillators for very high time and frequency accuracy... as well as TCXOs and cheap/standard XOs. Oscillator specs are one of the first things I look at for most projects. If they used just about any crystal, I can't imagine that's the problem.
Generally, an RTC is driven by a 32.768 kHz crystal with somewhere around 20ppm (parts per million) accuracy. This clock's error is WAY outside that spec. And since the error is specified in parts per million, the frequency of the crystal doesn't matter (i.e. a 10 MHz crystal off by 20ppm will keep time just as accurately as a 32 kHz crystal off by 20ppm). The easier way to think about it is that it'll be off by at most 20 seconds over 1 million seconds (~11.5 days).
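
If you want to sanity-check that math, here's a quick back-of-the-envelope in Python (just the worst-case spec numbers from above, nothing measured):

    # worst-case drift of a 20ppm crystal - note the crystal's
    # actual frequency never enters into it
    ppm = 20
    day = 86400                 # seconds in a day
    print(day * ppm / 1e6)      # 1.728 s/day worst case
    print(1e6 / day)            # ~11.57 days in a million seconds (20 s max error)

So even a bottom-of-the-barrel watch crystal should stay within a couple seconds a day.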
I guess it's possible that they went really cheap and, rather than using a crystal, used the MCU's internal RC oscillator (that'd be a very amateur move for someone making something intended to keep time). Those are usually spec'd to around 1-5%, or sometimes much worse for the really low-power ones. In that case, being off by even 1% would give HORRIBLE accuracy (up to ~14 minutes per day). I can't say my clock was ever that bad. Like most oscillators, they're VERY sensitive to temperature, so the clock would run at noticeably different rates as the room warms and cools.
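
Same back-of-the-envelope for the RC oscillator theory (hypothetical 1% error, the low end of typical internal-oscillator specs):

    # drift of a 1% RC oscillator
    error = 0.01                # 1% frequency error
    day = 86400                 # seconds in a day
    print(day * error / 60)     # 14.4 minutes/day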
Most plug-in clocks simply use the line frequency to keep time. Line frequency is held to pretty good long-term specs, but it's also why you usually can't use a US clock abroad, even if you have a voltage converter (the clock expects 60Hz but only gets 50Hz, so it runs slow). Of course, this clock uses a wall wart to supply DC power, so that's not an option here.
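
For fun, here's how badly the 50Hz/60Hz mismatch would hurt, assuming the clock counts line cycles directly:

    # a 60Hz clock fed 50Hz only counts 5/6 of the expected cycles
    print(24 * (1 - 50/60))     # loses ~4 hours per real day

That's why a line-timed US clock on a European outlet isn't just a little slow - it's unusably slow.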
DogP