Just getting a temperature output from one of these IC thermometers is easy, and if you let the whole setup sit still long enough that you know everything is at the same temperature, then you can trust your measurement. Just make sure the transistor and the thermometer are close together and share as much copper as possible, and give them enough time to settle (we're talking minutes here, but I really don't do germanium transistors...).
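Here's a rough sketch of what I mean by "let it settle": keep polling the sensor and only accept the reading once it stops drifting. I'm assuming a DS18B20-style chip on the Linux 1-Wire (w1-therm) bus here, and the 0.1 °C window and the timings are just numbers I picked, so adjust for whatever thermometer and coupling you actually have:

```python
import glob
import time

def read_temp_c(device_glob="/sys/bus/w1/devices/28-*/w1_slave"):
    """Read a DS18B20-style 1-Wire thermometer through the Linux w1-therm
    sysfs interface (my assumption about the setup -- swap this for whatever
    your thermometer actually speaks)."""
    path = glob.glob(device_glob)[0]
    with open(path) as f:
        lines = f.read().splitlines()
    if not lines[0].strip().endswith("YES"):
        raise IOError("sensor reported a bad CRC")
    return int(lines[1].split("t=")[1]) / 1000.0   # driver reports millidegrees C

def wait_until_settled(window_c=0.1, hold_s=60, poll_s=5, timeout_s=900):
    """Poll until the reading has stayed within window_c for hold_s seconds.
    All the numeric defaults are illustrative, not magic."""
    start = time.monotonic()
    stable_since = None
    last = read_temp_c()
    while time.monotonic() - start < timeout_s:
        time.sleep(poll_s)
        now = read_temp_c()
        if abs(now - last) <= window_c:
            if stable_since is None:
                stable_since = time.monotonic()
            if time.monotonic() - stable_since >= hold_s:
                return now              # drift has flattened out -- trust this reading
        else:
            stable_since = None         # still moving, keep waiting
            last = now
    raise TimeoutError("temperature never settled -- check the thermal coupling")

if __name__ == "__main__":
    print("settled at %.3f degC" % wait_until_settled())
```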
I like heat and measuring it. I've worked with scientific equipment for heat measurement which had lots of those one-chip thermometers (I believe they were the Dallas Semiconductor ones...) with a digital output. For the equipment to perform well, it had to know the temperature at various points on the circuit board to cancel out ambient temperature "noise". Measuring heat can easily take off into a nightmare of complexity... trust me on this one!
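To give a flavour of that ambient-cancellation trick (my own sketch, not how that particular instrument did it): you take a calibration snapshot of every board sensor, then subtract each point's temperature-induced drift, scaled by a sensitivity coefficient, from the reading you actually care about. The coefficients come from characterising the instrument; everything here is illustrative:

```python
def cancel_ambient(raw, board_temps_c, cal_temps_c, sens_per_c):
    """Subtract the estimated temperature-induced error from a raw reading.

    raw           -- the measurement you actually care about
    board_temps_c -- current temperatures at each monitored point on the board
    cal_temps_c   -- temperatures at those same points when the unit was calibrated
    sens_per_c    -- how hard each point's drift pulls the reading, in units per degC
                     (made-up coefficients for illustration, not from a real design)
    """
    correction = sum(s * (t - t0)
                     for s, t, t0 in zip(sens_per_c, board_temps_c, cal_temps_c))
    return raw - correction
```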

Another way to go about it is to give the transistors the same known temperature. Dip the tops of them in ice water...
...knowing how much the temperature is affecting things to offset the output
Ah, yes! And this is where the complexity starts to take off... But we're still talking about a limited temperature range... the real non-linearity only kicks in once the transistor is getting warm, and I don't know if you need to let it get warm to measure it? Because then you would have a real problem. With a constant current and the transistors given a temperature reference (ice bath) you could have the setup ready for measurements in just 10 minutes...
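For a sense of scale (typical textbook numbers, not anything I've measured on your parts): a silicon junction at constant current moves roughly -2 mV/°C, so if you trust the ice bath as your 0 °C reference, pulling a Vbe reading taken at some other temperature back to that reference is a one-liner -- as long as you stay in the small, roughly linear range:

```python
VBE_TEMPCO_V_PER_C = -0.002   # ~-2 mV/degC for a silicon junction at constant current (typical value)

def vbe_at_reference(vbe_measured_v, junction_temp_c, ref_temp_c=0.0):
    """Project a Vbe reading back to the ice-bath reference temperature.

    Assumes the same constant current for every measurement and a linear
    tempco over the small range involved -- which is exactly the bit that
    stops being true once the transistor warms up.
    """
    return vbe_measured_v - VBE_TEMPCO_V_PER_C * (junction_temp_c - ref_temp_c)

# e.g. 0.612 V measured at 4 degC comes out as roughly 0.620 V at the 0 degC reference
print(vbe_at_reference(0.612, 4.0))
```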

-yikes!
The idea of measuring temperature is really good thinking, but in reality I don't think it would do you any good... There are just too many variables involved.
Don't go there...

Just getting a temperature reading is easy!
It will probably do you no good!
Just finding out whether it is doing you any good will be a quest into a jungle with lurking dangers and no map...