My colleagues talked about this topic during lunch break, but I don’t know much about it, so it was difficult for me to join the discussion. I would like to learn more about it and hope that someone can help me understand the basic structure of a USB connector.
Well first off, there are all sorts of USB connectors: USB-A, USB-B (and Mini-B and Micro-B and the relatively rare Micro-AB), and USB-C. Then the USB-A and a few of the USB-B variants are different depending on whether they support USB 2.0 or 3.0. But I'll keep the information somewhat basic since you asked the question in a somewhat basic fashion.
USB uses a master/slave architecture where one device is the host and the other device is the peripheral, unlike FireWire, which used a peer-to-peer architecture. Before USB-C arrived, you probably noticed that USB cables had different connector styles on each end. The USB spec indicated that devices meant to be hosts would have a USB-A connector and devices meant to be peripherals would have some version of the USB-B connector. Some smartphones that could operate either as peripherals (when attached to a laptop, for example) or as hosts (when a memory card reader was attached to them) used the USB Micro-AB connector, whose receptacle accepts both Micro-A and Micro-B plugs, so that their role would change depending on which type of plug was inserted.
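That role switching (USB On-The-Go) hinges on the connector's ID pin: a Micro-A plug grounds it, while a Micro-B plug leaves it floating. Here's a minimal sketch of that decision in Python -- the function name and boolean input are my own illustration, not any real driver API:

```python
# Hypothetical sketch of USB On-The-Go role selection on a Micro-AB port.
# Assumption: the ID pin is grounded by a Micro-A plug and left floating
# by a Micro-B plug; the names here are illustrative only.

def otg_role(id_pin_grounded: bool) -> str:
    """Return the role a dual-role device assumes for the inserted plug."""
    if id_pin_grounded:
        return "host"        # Micro-A plug inserted: act as the host
    return "peripheral"      # Micro-B plug inserted: act as the peripheral

print(otg_role(True))   # Micro-A end plugged in -> host
print(otg_role(False))  # Micro-B end plugged in -> peripheral
```

So the same phone behaves as a host for one cable and a peripheral for another, with no software configuration by the user.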
Prior to USB 3.0, the USB connector itself used 4 pins: positive and negative Data pins, a power pin, and an electrical grounding pin. When USB 3.0 arrived, that was expanded to 9 pins: the same 4 were kept for backward compatibility, and the new pins were two pairs of positive and negative SuperSpeed Data pins plus another electrical grounding pin. Having two pairs of Data pins meant that USB 3.0 could send data in both directions simultaneously ("full duplex"), unlike USB 2.0, which could only send data in one direction at a time ("half duplex"). Hence, whereas USB 2.0 maxed out at 480 Mbps, USB 3.0 maxed out at 5 Gbps in each direction. USB 3.1 Gen 2 subsequently raised that to 10 Gbps, although it's relatively rare to find that standard supported on peripherals that use USB-A and USB-B connectors; most systems adopted it as part of implementing USB-C connectors.
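If it helps to see the 4-pin vs. 9-pin layout laid out concretely, here's a rough sketch in Python. The pin names follow the usual spec shorthand (VBUS, D+/D-, SSTX/SSRX), but the list ordering is simplified -- real pin numbering differs between A-type and B-type connectors:

```python
# Simplified model of the pin sets described above. Ordering is
# illustrative; actual pin numbering varies by connector type.
USB2_PINS = ["VBUS (power)", "D-", "D+", "GND"]

# USB 3.0 keeps those 4 and adds two SuperSpeed differential pairs
# (one transmit, one receive) plus an extra ground.
USB3_EXTRA_PINS = ["SSRX-", "SSRX+", "GND (drain)", "SSTX-", "SSTX+"]

usb3_pins = USB2_PINS + USB3_EXTRA_PINS
print(len(USB2_PINS))   # 4
print(len(usb3_pins))   # 9

# Dedicated transmit and receive pairs are what make full duplex possible.
speeds = {
    "USB 2.0": "480 Mbps, half duplex (one shared data pair)",
    "USB 3.0": "5 Gbps each direction, full duplex",
    "USB 3.1 Gen 2": "10 Gbps each direction, full duplex",
}
```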
USB-C is significantly more complicated. First of all, it has two rows of pins arranged in a mirror fashion (Google "USB-C pinout") so that the connector is reversible, unlike all previous USB connectors. Second, it uses the same connector on each end of the cable, because USB-C specifies one connector regardless of whether the device is meant to be a host, a peripheral, or either one. In terms of capabilities, I wrote a fairly long post here in which I go into a bit of detail about how the connector's pins work. There are a few other pins in the connector that I don't address there, but their functions are relatively minor -- other than power, where USB-C's capabilities were significantly enhanced: the USB Power Delivery spec allows up to 100W to be carried over a USB-C cable, and it also allows power to flow in either direction, whereas with previous USB versions, power could only ever flow from host to peripheral.
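To see where that 100W figure comes from, it's just voltage times current: USB Power Delivery raises both compared to older ports. A quick back-of-the-envelope in Python (the 5V/0.5A, 5V/0.9A, and 20V/5A values are the commonly cited per-spec limits; the function itself is only illustrative):

```python
# Back-of-the-envelope USB power math: P = V * I.
def watts(volts: float, amps: float) -> float:
    return volts * amps

print(watts(5, 0.5))  # classic USB 2.0 port: 2.5 W
print(watts(5, 0.9))  # USB 3.0 port: 4.5 W
print(watts(20, 5))   # USB-C with Power Delivery: 100 W
```

That jump from 2.5W to 100W is why a single USB-C port can charge a laptop, and the direction-agnostic power path is why the same port can either charge the laptop or let the laptop charge a phone.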