Relational information systems, systems that can be represented by tables of finite states, are commonly used in many areas such as logic circuits, finite-state machines, and relational databases. Decomposition is a natural method of handling complex systems and removing redundancy. It splits a table into a network of smaller, simpler, and interrelated tables. In order to preserve the original features of the system, any valid decomposition must be lossless. Commutative partitions play an important role in decomposition. The commutative property is essentially a general algebraic formulation of the independence of two partitions. We express the interdependence of two commutative partitions by a bipartite graph, and classify the hierarchical independence structures by topological properties of these bipartite graphs. In particular, we show that two partitions are decomposable, the strongest of these structures, if and only if the associated bipartite graph is uniform. We also adopt Shannon's entropy to quantify the amount of information contained in each partition, and formulate information-lossless decomposition by the entropy conservation identity. Under the assumption of the running intersection property, we show that the general formulation of information-lossless decomposition of relational systems is given by the entropy inclusion-exclusion equality. Applications of these formulations to Boolean logic circuits and relational databases are presented to illustrate the information-lossless decomposition process.
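As a minimal illustration of the entropy-based formulation, the sketch below (an assumption-laden toy, not the paper's construction) computes the Shannon entropy of the partitions induced by two attributes of a small table, under the uniform distribution on rows. When the two attributes are independent, the entropy of their common refinement splits additively, which is the simplest instance of the entropy conservation identity (here the meet is the trivial partition, with zero entropy). The helper name `partition_entropy` and the toy table are hypothetical.

```python
from math import log2
from itertools import product

def partition_entropy(rows, key):
    """Shannon entropy (in bits) of the partition of `rows` induced by `key`,
    under the uniform distribution on rows."""
    blocks = {}
    for r in rows:
        blocks.setdefault(key(r), []).append(r)
    n = len(rows)
    return -sum(len(b) / n * log2(len(b) / n) for b in blocks.values())

# A toy relational table: all (a, b) pairs, so the two attributes are independent.
rows = list(product(range(2), range(3)))

H_A  = partition_entropy(rows, lambda r: r[0])  # partition induced by attribute A
H_B  = partition_entropy(rows, lambda r: r[1])  # partition induced by attribute B
H_AB = partition_entropy(rows, lambda r: r)     # common refinement of the two

# For independent partitions, H(A) + H(B) = H(A join B):
# 1 bit + log2(3) bits = log2(6) bits.
assert abs(H_A + H_B - H_AB) < 1e-9
```

A lossy decomposition would show up here as a strict inequality, with the joined table carrying less entropy than the sum of its parts accounts for.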