Let E be a finite set, P(E) the family of its subsets, and f a mapping from P(E) to the set of non-negative reals such that for any two disjoint subsets A, B of E, f(A∪B) = f(A) + f(B). Prove that there exists a subset F of E with the following property: if with each A ⊂ E we associate the subset A′ consisting of the elements of A that are not in F, then f(A) = f(A′), and f(A) = 0 if and only if A ⊂ F.
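
One natural line of attack, sketched below as a compilable LaTeX note. This is a proposed approach, not a verified official solution; the candidate set F and the singleton decomposition are my own choices, not part of the original statement.

```latex
% Sketch of a candidate construction (an assumption of this note,
% not necessarily the intended solution).
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}

First, $f(\varnothing) = f(\varnothing \cup \varnothing) = 2f(\varnothing)$,
since $\varnothing$ is disjoint from itself; hence $f(\varnothing) = 0$.
By induction on $|A|$, additivity over disjoint unions yields
\[
  f(A) = \sum_{x \in A} f(\{x\}) \qquad \text{for every } A \subseteq E.
\]

Now take $F = \{\, x \in E : f(\{x\}) = 0 \,\}$. For any $A \subseteq E$,
write $A' = A \setminus F$; since $A'$ and $A \cap F$ are disjoint with
union $A$,
\[
  f(A) = f(A') + f(A \cap F)
       = f(A') + \sum_{x \in A \cap F} f(\{x\})
       = f(A').
\]
Finally, because every term $f(\{x\})$ is non-negative, $f(A) = 0$ holds
iff $f(\{x\}) = 0$ for every $x \in A$, i.e.\ iff $A \subseteq F$.

\end{document}
```

The choice of F as the set of elements whose singletons vanish under f is forced in one direction: any F satisfying the conclusion must contain every such element, which is why it is the natural candidate to check.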