Hello, this is something that I really struggled with when I first entered the industry after school, and the engineer I spoke to did a really good job of helping me to get out of the mental rut I found myself in. Hope this helps someone else out there!

So, in school, I came to understand Ohm's Law, and that current is essentially the direct result of a voltage across a resistance of some kind. I also understood basic transformer theory: with power being equal on both sides, voltage is stepped up or down by the turns ratio (and since power is equal, current scales in the opposite direction to voltage).
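To put numbers on that ideal-transformer relationship, here's a quick sketch (the voltages, currents, and turn counts are just made-up illustrative values):

```python
# Ideal transformer: power in equals power out, so voltage scales with the
# turns ratio and current scales inversely. All numbers here are hypothetical.
def secondary_vi(v_primary, i_primary, n_primary, n_secondary):
    ratio = n_secondary / n_primary
    v_secondary = v_primary * ratio   # voltage follows the turns ratio
    i_secondary = i_primary / ratio   # current goes the opposite way
    return v_secondary, i_secondary

v2, i2 = secondary_vi(v_primary=120.0, i_primary=10.0,
                      n_primary=100, n_secondary=1000)
print(v2, i2)  # 1200.0 V and 1.0 A -- power (1200 W) is unchanged
```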

I had never been exposed to a CT until I worked for a transformer manufacturer as a tester. The new hires got to have some one-on-one theory discussions with the engineers, which I personally found really helpful. Everyone kept telling me that the current is constant, and that voltage doesn't matter on the secondary (essentially, it would put out its 5A no matter what it took). Now, in my mind, this simply did NOT make sense! "How could there be a current with no voltage???" After much back and forth with the engineer, he helped explain it from a completely different perspective.

Essentially, he told me that a CT is no different than any other transformer; it just has a very high ratio. He then told me to think of a transformer like a power supply that you would model in school. When you model power supplies, depending on how you want to look at it, there is an upper limit on voltage, current, and power. I prefer using a Thevenin equivalent model, since, if you want to get technical, current sources are a bit of an abstraction (I'll get to that, since a CT essentially is a current source). In such a model, the source voltage is fixed, and there is an upper limit on the current it can produce, which also limits the power it can supply. A model with 120V in series with 12 Ohms will never be able to exceed 10A of current, and that's if you literally short out the supply.
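That Thevenin model is easy to sanity-check in code, using the same 120V / 12 Ohm numbers from above:

```python
# Thevenin model of a supply: a fixed source voltage behind a series
# internal resistance. Values match the example in the text.
def thevenin_output(v_source, r_internal, r_load):
    i = v_source / (r_internal + r_load)  # Ohm's law around the whole loop
    v_load = i * r_load                   # voltage actually at the terminals
    return i, v_load

# A dead short (0 Ohm load) gives the absolute maximum current.
i_short, _ = thevenin_output(120.0, 12.0, 0.0)
print(i_short)  # 10.0 A -- can never be exceeded, no matter the load
```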

A CT essentially acts as a power supply with a VERY high voltage but a rather low limit on the current it can supply. The way it was explained to me is that while it acts like a normal transformer, it gets completely loaded down by any load you put on it. So, if it stepped up to, say, 10kV, and you short it out with a 1 Ohm wire, then even though Ohm's Law says you should see 10kA of current, the supply hits its limit of what it can actually provide: the current tops out at 5A, and the voltage collapses accordingly to 5V in that instant (keeping Ohm's Law valid).
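You can model that with the same Thevenin picture. If we take the (made-up) 10kV figure and a 5A secondary limit, the implied internal resistance is 10kV / 5A = 2kOhm, and the 1 Ohm "short" works out like this:

```python
# A CT sketched as a Thevenin source: a huge source voltage behind a
# correspondingly huge internal resistance. All numbers are hypothetical,
# chosen only to match the example: 10 kV source, limited to 5 A.
V_SRC = 10_000.0        # "stepped up" source voltage, volts
R_INT = V_SRC / 5.0     # internal resistance that caps the current at 5 A

def ct_secondary(r_burden):
    i = V_SRC / (R_INT + r_burden)
    v = i * r_burden    # terminal voltage collapses under a small burden
    return i, v

i, v = ct_secondary(1.0)             # the 1 Ohm "short" from the example
print(round(i, 3), round(v, 3))      # ~5 A at ~5 V, nowhere near 10 kA
```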

While you always need to make sure never to open up a CT, in theory the only reason for that is a lack of clearance between the secondary terminals. Using easy numbers (I don't recall the actual voltage you could see), if it stepped up to 10kV, and the air between the terminals has 1 gigaohm of resistance, then even though the current wouldn't be the full 5A, you could still expect about 10 microamps of current, which would show up as a clear arc. In theory, if you could take the terminals and put an adequate dielectric between them, the CT would act as a simple step-up transformer, and you could measure the 10kV across the terminals without any measurable current.
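The same hypothetical model (10kV behind an implied 2kOhm of internal resistance) reproduces the open-circuit numbers:

```python
# Open-circuited CT: the "load" is just the air gap between the terminals.
# Same made-up numbers as the text: 10 kV source, 5 A current limit
# (implied internal resistance of 2 kOhm), and 1 gigaohm of air.
V_SRC = 10_000.0
R_INT = V_SRC / 5.0     # 2 kOhm
R_AIR = 1e9             # resistance of the air gap, ohms

i = V_SRC / (R_INT + R_AIR)  # almost exactly V_SRC / R_AIR
v = i * R_AIR                # nearly the full 10 kV appears at the terminals
print(i * 1e6, v)            # ~10 microamps at nearly the full 10 kV
```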

So, the abstraction of current sources I mentioned? Well, in summary, a current source is just a very high voltage source with a limit on the current it can provide, and the voltage loads down to wherever it needs to be. The source voltage is high enough that the load resistance barely factors into the math, until you reach a very high resistance load, at which point you basically start getting a voltage divider, and the CT is overburdened. Hope that someone out there struggling with the concept of CTs found this helpful!
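For anyone who wants to see that current-source behavior numerically, here's a quick burden sweep with the same hypothetical model (10kV behind 2kOhm, i.e. a 5A limit):

```python
# Sweeping the burden resistance shows why a CT looks like a current source:
# current is essentially flat until the burden approaches the internal
# resistance. Hypothetical model: 10 kV behind 2 kOhm (a 5 A limit).
V_SRC, R_INT = 10_000.0, 2_000.0

results = {}
for r_burden in [0.1, 1, 10, 100, 2_000, 100_000]:
    i = V_SRC / (R_INT + r_burden)
    results[r_burden] = i
    print(f"{r_burden:>8} ohm -> {i:8.4f} A")
# Roughly 5 A for small burdens, dropping to 2.5 A at 2 kOhm (the voltage
# divider point) and collapsing beyond it -- the overburdened regime.
```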