I know this is an old topic, but I figured I'd explain the reasons behind this, as I've had to deal with this problem while programming microcontrollers.
Nothing in the world of everyday electronics or software is ever truly random; without dedicated hardware, what a computer produces is pseudo-random.
Random numbers are generated by an algorithm that takes a certain input as a 'seed', then passes that seed through the algorithm to produce a sequence of 'random' numbers.
What gets used as that seed depends on who wrote the algorithm, and on whether the seed can be set to something other than the default (or whether the coder has actually bothered to set it to something more random). If you go back to the days of the original Apple Macs, you could take the same random-number program, run it on a bunch of identical Macs, and get the same sequence of numbers on all of them. IIRC the old Mac algorithm copied a fixed memory location as the initial seed, so if every computer loaded the same software, the initial seed was always the same.
Modern basic random-generator algorithms are better, but you can still end up with the same fixed sequence of numbers unless you seed the generator with something a bit more random. Personally, on a phone or computer, I'd derive the seed from the current date/time, which varies from run to run.