When discussing Algorithmic Information Theory, I credit Chaitin, largely because I am most familiar with his work.
In more detail:
From what I gather about AIT: Solomonoff did the earliest work, Chaitin has done the most work, and Kolmogorov is over-credited. Kolmogorov was a great mathematician, but he apparently contributed little to AIT (he saw it as relatively insignificant foundations of probability), and attaching his name to the field is an egregious example of the Matthew Effect.
Notably, Li and Vitanyi, in "An Introduction to Kolmogorov Complexity" (p. 84), explain their use of Kolmogorov's name in "Kolmogorov complexity" as an example of the Matthew Effect. This seems to be a misunderstanding on their part: as I understand it, the Matthew Effect names a bad thing, a minor contributor being over-credited because of their seniority.
Chaitin gives a history of AIT in The Unknowable, chapter 6. In it, he credits Solomonoff with the insight that algorithmic complexity quantifies Occam's razor (though Solomonoff did not see that it allows one to define randomness). On Kolmogorov, Chaitin writes: "As far as I know, Kolmogorov only publishes 3 or 4 pages on program-size complexity".
Chaitin is a figure of some controversy in mathematics. Thus, I feel a need to explain my respect and praise for him.