From the information I have been able to find so far,
<noindex> is supposed to achieve this, hiding a single section of a page from search engine spiders. But it also seems this is not obeyed by many crawlers - so if that is the case, what markup should be used instead of, or in addition to, it?
There is no way to force crawlers not to index something; it's entirely up to each crawler's author to decide what it does. The rule-obeying ones - Yahoo! Slurp, Googlebot, etc. - each have their own rules, as you've already discovered, but even then it's up to them whether to obey those rules completely or not. Say you mark a section with robots-nocontent: that part may still be crawled and stored somewhere else, perhaps for checks for spam, illegal material, malware, etc.
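For reference, the two section-level mechanisms mentioned here look roughly like this. Note the caveats: robots-nocontent is a Yahoo-specific class attribute, and <noindex> is recognized mainly by Yandex (which suggests the comment form so the page still validates) - this is a sketch of the syntax, not a guarantee any given crawler honors it:

```html
<!-- Yahoo! Slurp: the robots-nocontent class marks a section
     the crawler is asked not to use for indexing or abstracts -->
<div class="robots-nocontent">
  Sidebar or boilerplate text you'd rather not have indexed.
</div>

<!-- Yandex: the non-standard noindex tag, written as comments
     so the markup remains valid HTML -->
<!--noindex-->
  Text Yandex is asked to skip.
<!--/noindex-->
```

Other crawlers will simply ignore both and index the content as usual.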
And that's just the "good" ones; there's no telling what the bad ones will do. So think of all the noindex mechanisms as a set of guidelines, not a set of strict rules.
The only thing that works for sure: if you have sensitive data, or you simply don't want something indexed, don't make it publicly available.