What you've just stumbled upon is the same puzzle that stumped many astrophysicists in the early 20th century. The "100 million degrees" figure you quote is indeed roughly the temperature at which a significant fraction of the plasma can overcome the classical Coulomb barrier and fuse. But we know the Sun's core fuses hydrogen at only about 15 million kelvin, so why is it so much colder than it "should" be? The answer comes down to density and quantum tunneling.
It turns out that confining plasma heated to millions of degrees is quite difficult. Terrestrial fusion devices can therefore hold only a small amount of low-density plasma at a time, so to get a meaningful reaction rate, we have to heat the plasma until a large fraction of it is fusing.
The Sun, however, has no trouble confining plasma; it does so effortlessly, with gravity. It doesn't particularly care whether most of the plasma is fusing: there's no shortage of it, after all, and what's there sits at very high density. To keep itself burning, only a small fraction of the plasma needs to reach fusion energies. Since any temperature gives you a high-energy tail in the distribution of particle kinetic energies, it stands to reason that even at a cooler temperature there might be enough plasma fusing to counterbalance gravitational contraction.
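The size of that high-energy tail is easy to estimate. Here's a back-of-envelope sketch in Python; the barrier height (~1 MeV for bare protons at nuclear separation), the core temperature, and the proton count are assumed round numbers, not precise values:

```python
import math

# Fraction of core protons energetic enough to classically
# overcome the p-p Coulomb barrier (assumed values below).
K_B_EV = 8.617e-5          # Boltzmann constant, eV/K
T_CORE = 1.5e7             # solar core temperature, K (~15 million K)
E_BARRIER_EV = 1.0e6       # classical p-p Coulomb barrier, ~1 MeV (assumed)

x = E_BARRIER_EV / (K_B_EV * T_CORE)   # barrier height in units of kT

# High-energy tail of the Maxwell-Boltzmann distribution:
# P(E > E_b) ~ (2/sqrt(pi)) * sqrt(x) * exp(-x) for x >> 1.
# Work in log10 to avoid floating-point underflow.
log10_fraction = (math.log10(2 / math.sqrt(math.pi))
                  + 0.5 * math.log10(x)
                  - x / math.log(10))

N_PROTONS = 1e56           # rough proton count in the core (assumed)
log10_expected = log10_fraction + math.log10(N_PROTONS)

print(f"barrier / kT            ~ {x:.0f}")
print(f"log10(tail fraction)    ~ {log10_fraction:.0f}")
print(f"log10(expected protons) ~ {log10_expected:.0f}")
```

The barrier sits hundreds of times above kT, so the tail fraction comes out somewhere around 10^-300: even with ~10^56 protons in the core, the expected number above the barrier is effectively zero.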
But if you actually examine the tail of the Maxwell-Boltzmann distribution at 15 million kelvin, there still isn't enough material at high enough energy to overcome the classical Coulomb barrier. It was at this point that astrophysicists realized that you don't actually have to go over the barrier: you can quantum-tunnel through the last bit of it. In any single collision this happens only rarely, but the density in the Sun's core is high enough to make up the deficit, which explains how the Sun is able to hold itself up at such a low temperature.
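You can see why tunneling rescues the rate with a rough sketch of the Gamow factor, the WKB tunneling probability through a Coulomb barrier. The temperature is the same assumed value as above, and evaluating at a typical thermal energy (rather than the true Gamow peak) is a simplification:

```python
import math

# Gamow tunneling probability for two protons colliding at energy E:
#   P(E) ~ exp(-sqrt(E_G / E)),
# with Gamow energy E_G = 2 * m_r c^2 * (pi * alpha * Z1 * Z2)^2.
ALPHA = 1 / 137.036        # fine-structure constant
MP_C2_EV = 938.272e6       # proton rest energy, eV
MR_C2_EV = MP_C2_EV / 2    # reduced mass energy of the p-p system
E_GAMOW_EV = 2 * MR_C2_EV * (math.pi * ALPHA) ** 2   # ~490 keV for Z1=Z2=1

K_B_EV = 8.617e-5          # Boltzmann constant, eV/K
T_CORE = 1.5e7             # assumed core temperature, K
E_TYPICAL_EV = K_B_EV * T_CORE   # typical thermal energy, ~1.3 keV

p_tunnel = math.exp(-math.sqrt(E_GAMOW_EV / E_TYPICAL_EV))
print(f"Gamow energy       ~ {E_GAMOW_EV / 1e3:.0f} keV")
print(f"tunneling prob(kT) ~ {p_tunnel:.1e}")
```

Instead of the astronomically small classical tail fraction, a typical thermal proton pair tunnels with probability on the order of one in a billion per close encounter. That is still rare, which is why the enormous density and particle count of the core matter, but it is rare in a way the Sun can easily afford.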