I just happened to come across this while working on my Thermal Systems homework, so I thought I'd incorporate it. Here's a quick graph I worked up of compressor inlet temperature vs. delta T (heat added to the air) for three inlet temperatures (50, 75, and 120 degrees F; 50 and 75 would likely be CAI temps depending on climate, and 120 is a likely under-hood temp, also depending on climate) and two pressure ratios (2.36 and 1.82, i.e. 20 and 12 psi of boost respectively, at sea level).
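For anyone who wants to poke at the numbers themselves, here's the math behind the graph as a quick Python sketch. It's just the standard isentropic-compression temperature relation with an efficiency correction; the gamma = 1.4 and 70% efficiency values are the assumptions stated above, nothing specific to a particular compressor:

```python
GAMMA = 1.4   # ratio of specific heats for air
ETA_C = 0.70  # assumed isentropic compressor efficiency

def delta_t_f(t_in_f, pressure_ratio, eta=ETA_C):
    """Temperature rise (deg F) across the compressor for a given inlet temp."""
    t_in_r = t_in_f + 459.67  # work in absolute temperature (Rankine)
    ideal_ratio = pressure_ratio ** ((GAMMA - 1) / GAMMA)
    return t_in_r * (ideal_ratio - 1) / eta  # real rise = ideal rise / efficiency

for pr, boost in [(2.36, 20), (1.82, 12)]:
    for t_in in (50, 75, 120):
        print(f"{boost} psi boost, {t_in} F inlet -> dT = {delta_t_f(t_in, pr):.0f} F")
```

At 20 psi that works out to roughly 202, 212, and 230 °F of temperature rise for the 50, 75, and 120 °F inlets, which is the kind of spread the graph shows.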
BTW: this is my first post here, and my first post in general for a long while, so sorry if the image gets screwed up.
As you can see, the energy (heat) that gets added to the air isn't a fixed amount; it's directly proportional to the absolute inlet temperature (Rankine), so every extra degree at the compressor inlet shows up as more than an extra degree at the outlet. That penalty gets more drastic as boost pressure is increased and/or compressor efficiency is decreased (these curves assume 70% efficiency, which is being generous). As for the effect on power, I'm not sure; I know I've done the calculations before and it was at least somewhat significant, so if I have time I'll try to redo those as well. This also doesn't include the effects of intercooling, but if I remember correctly, the temperature drop through an intercooler is linearly proportional to the intercooler inlet temp, meaning it only removes a fraction of the additional delta T rather than all of it. But don't quote me on that one just yet.
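For what it's worth, a simple constant-effectiveness intercooler model backs that up (the 70% effectiveness and 75 °F ambient below are my assumptions, purely for illustration): the drop is effectiveness times (inlet temp minus ambient), so it's linear in the inlet temp and only claws back a fraction of the extra heat:

```python
def intercooler_out_f(t_in_f, t_ambient_f, effectiveness=0.70):
    """Charge temp (deg F) leaving a constant-effectiveness intercooler."""
    # drop = effectiveness * (t_in - t_ambient): linear in inlet temp, so only
    # a fixed fraction of any extra compressor-outlet heat gets removed
    return t_in_f - effectiveness * (t_in_f - t_ambient_f)

# Compressor-outlet temps for the 20 psi / 70%-efficiency case above:
# a 50 F inlet comes out around 252 F, a 120 F inlet around 350 F.
for t_comp_out in (252, 350):
    print(f"{t_comp_out} F in -> {intercooler_out_f(t_comp_out, 75):.0f} F out")
```

With those numbers the hot-inlet case is still about 30 °F hotter after the intercooler, so the intercooler doesn't make up the difference.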
Long story short: hot air intake (mounted in the engine bay) bad, cold air intake (as in cold air actually being induced, not necessarily a branded "CAI") good.
Oh, and to help explain what I think Konkordmusk was trying to say: hotter air is less dense, so the compressor wheel must do more work on each pound of air to bring it to the desired pressure (the work per unit mass scales with absolute inlet temperature, which at a fixed pressure is inversely proportional to density). The more work the compressor does, the more heat ends up in the working fluid, since no process is 100% efficient.
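To put a rough number on that: the compressor work per pound of air is cp times delta T, which makes it directly proportional to the absolute inlet temperature. A minimal sketch using the same relation as above (the cp value is the usual constant for air; the 70% efficiency is still my assumption):

```python
CP_AIR = 0.24  # Btu/(lbm*R), roughly constant for air

def compressor_work_btu_per_lbm(t_in_f, pressure_ratio, eta=0.70):
    """Compressor work per lbm of air, proportional to absolute inlet temp."""
    t_in_r = t_in_f + 459.67  # absolute temperature (Rankine)
    return CP_AIR * t_in_r * (pressure_ratio ** (0.4 / 1.4) - 1) / eta

print(compressor_work_btu_per_lbm(50, 2.36))   # ~48.6 Btu/lbm
print(compressor_work_btu_per_lbm(120, 2.36))  # ~55.3 Btu/lbm, ~14% more work
```

Same 20 psi of boost, but the 120 °F inlet costs about 14% more work per pound of air than the 50 °F inlet, and in an adiabatic compressor all of that work ends up as heat in the charge.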
Thanks for your time; I'm gonna go get some more Coke so I can finish all this damn homework/projects now.
Edit: Y axis should say (T out - T in)