If it's only 'thought', not a real thought, you will have more trouble keeping it from crashing than with anything else. A digital AI is only as good as the language it's written in, since it's no different from a very, very big 'choose your own adventure' book, and humans' language capacity will always be inferior to their full brain capacity.
But, on the practical side: you can't really 'isolate' the system, because you need the terminals, right? It's like caging a human and putting monkeys in charge of keeping him locked up. Even if the human can't physically get out (in your cruelty, you, the monkey scientist, have created your human slave without arms or legs), he can easily control his surroundings by making all the monkeys dance to whatever tune he wants.
@khade: Surely this is an option too. If you want a powerful ally, and you're afraid of altering yourself... you only have to cede the real leadership (even if you remain the formal leader, you're no longer The Brain... and you can't even tell whether your new ally is loyal, because it outsmarts you by far...).
However (and this is to DoktorV as well), if you've created something with only 2x or 3x human capacity, there is no problem: a well-organized biological network, meaning a healthy, efficient organization made of living humans, can, as a whole, effectively outsmart and control it. The real problems begin if your creation is 10x or 100x smarter...
Back to DoktorV... If you're cruel enough to create a sapient being with no capacity for physical action, you might as well throw away all the metal and use flesh. A network of a few thousand human brains, kept in a vat, would give you high cogitation capacity, although limited in the tasks it can do (as the brains would need to be sufficiently lobotomized or drugged). This also suggests that if you created a similarly shackled being of metal, it too would be limited to tasks of a sub-conscious scope.