I have done pretty extensive testing on my 200A Bosch small-frame alternator. It can supply 120-150A at hot idle. Since about 30A of that goes to my engine systems, I have 90-120A available at hot normal idle. The alternator typically runs between 13.8-14.1V. As it heats up the regulator drops the voltage, so after running full tilt for 30 minutes in hot weather the current will drop (for battery charging applications). For constant load applications (cooking, toaster oven, etc.), the voltage will drop instead. If I exceed the alternator's capacity with the inverter, the current will stay mostly stable at the hot-idle level, but the voltage will drop until the starter battery begins to supply the balance of the watts.
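If you want to run the same budget for your own setup, here is a minimal back-of-envelope sketch in Python using my numbers from above. The 30A engine-load figure in particular is specific to my vehicle, so treat all three inputs as assumptions to replace with your own measurements.

# Rough budget for house loads at hot idle, using my measured numbers.
alt_output_a = 120.0    # low end of my 120-150A hot-idle range
engine_loads_a = 30.0   # roughly what my engine systems draw
bus_v = 13.8            # low end of the typical 13.8-14.1V range

available_a = alt_output_a - engine_loads_a
available_w = available_a * bus_v
print(f"{available_a:.0f}A / {available_w:.0f}W available at hot idle")
# prints: 90A / 1242W available at hot idle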
An alternator's output current limit is set by the max field current the regulator will supply (at a given RPM). That, multiplied by the alternator's output voltage, is the max watts. If the load would demand a current greater than that limit, the output voltage sags but the current remains the same, so the thermal load (power dissipated inside the alternator) stays the same or actually declines. This same mechanism is used by smart regulators to prevent alternator overheating: they reduce the field current as temperature increases. This is pretty standard behavior on any common vehicle alternator made in the last 15 years. Engine compartments can be cramped, and idling in hot weather with high loads from AC and electronics is not unusual.
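To make the voltage-sag behavior concrete, here is a toy Python model of a current-limited alternator feeding a constant-power load like an inverter. It is a sketch under simplifying assumptions, not a real electrical model: the battery is treated as holding the bus at a fixed 12.6V once the alternator is pinned at its limit, and the 120A/13.8V figures are just my hot-idle numbers.

def bus_state(load_w, i_limit_a=120.0, v_nominal=13.8, v_batt=12.6):
    """Return (bus volts, alternator watts, battery watts) for a load."""
    if load_w <= i_limit_a * v_nominal:
        # Under the limit: the regulator holds nominal voltage and the
        # alternator covers the whole load.
        return v_nominal, load_w, 0.0
    # Over the limit: output current pins at i_limit_a, the bus sags to
    # the battery, and the battery supplies the balance of the watts.
    alt_w = i_limit_a * v_batt
    return v_batt, alt_w, load_w - alt_w

for load_w in (1000, 1656, 2000):
    v, alt_w, batt_w = bus_state(load_w)
    print(f"{load_w}W load -> {v:.1f}V bus, alt {alt_w:.0f}W, batt {batt_w:.0f}W")

Note that in the over-limit branch the alternator's current (and so its internal heating) does not rise; its delivered watts actually fall along with the voltage, which is exactly why the current limit protects it.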
Bumping to a high idle helps by increasing both cooling and available current. My 200A unit will supply 200A in cool weather at about 2k engine RPM. In hot weather it's closer to 180A. I have not tested in extreme weather, say 120F, yet, but I would expect something around 150A to be the max available current. My alternator is located fairly low in the engine compartment, so its cooling air flow is probably within 20F of ambient when moving, and 40F of ambient when idling.
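For a first-pass guess at other ambient temperatures, a straight line through my two data points works, though I suspect the real curve steepens at the high end, which is why I'd guess closer to 150A at 120F. The 60F and 95F anchor temperatures below are assumptions; I didn't log ambient precisely.

def est_max_amps(ambient_f, cool=(60.0, 200.0), hot=(95.0, 180.0)):
    # Linear derating fit through (ambient F, max A) points at ~2k rpm.
    slope = (hot[1] - cool[1]) / (hot[0] - cool[0])  # amps per degree F
    return cool[1] + slope * (ambient_f - cool[0])

print(f"~{est_max_amps(120):.0f}A linear estimate at 120F")
# prints ~166A; I'd plan on the lower 150A figure in practice.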