I have an algorithm in code that generates string permutations. For example, for the character set 'A', 'A', 'B', 'C' it generates 12 permutations, which is 4! / 2!. When I allow repetition within that algorithm, it produces 256 for the character set 'A', 'D', 'B', 'C', which is 4^4.
Finally, I can also control the length. For example, for the character set 'A', 'B', 'C', 'D', 'E', 'F' with length 2, it generates 6! / (6-2)! = 30 permutations.
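To make the three cases above concrete, here is a minimal Python sketch (my algorithm is in different code; this is just a brute-force check using the standard library) that reproduces the counts 12, 30, and 256:

```python
from itertools import permutations, product

def count_distinct_permutations(chars, length=None):
    """Count distinct orderings of `chars`, taken `length` at a time.

    Duplicate characters are collapsed by putting the tuples in a set,
    so ('A', 'A', 'B') is counted once no matter which 'A' came first.
    """
    if length is None:
        length = len(chars)
    return len(set(permutations(chars, length)))

# Duplicates, full length: 4! / 2! = 12
print(count_distinct_permutations(['A', 'A', 'B', 'C']))            # 12

# Distinct characters, restricted length: 6! / (6-2)! = 30
print(count_distinct_permutations(['A', 'B', 'C', 'D', 'E', 'F'], 2))  # 30

# Repetition allowed: 4^4 = 256 (product, not permutations)
print(len(set(product(['A', 'D', 'B', 'C'], repeat=4))))            # 256
```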
However, I am having trouble predetermining how many permutations will be generated when the requested string length is less than the number of characters in the character set and the character set contains duplicates. For example:
- Character set: 'A', 'A', 'C', 'D', 'E', 'F'
- Length: 3
I can run the algorithm and see that the result is 72, with the permutations below:
AAC, AAD, AAE, AAF, ACA, ACD, ACE, ACF, ADA, ADC, ADE, ADF, AEA, AEC, AED, AEF, AFA, AFC, AFD, AFE, CAA, CAD, CAE, CAF, CDA, CDE, CDF, CEA, CED, CEF, CFA, CFD, CFE, DAA, DAC, DAE, DAF, DCA, DCE, DCF, DEA, DEC, DEF, DFA, DFC, DFE, EAA, EAC, EAD, EAF, ECA, ECD, ECF, EDA, EDC, EDF, EFA, EFC, EFD, FAA, FAC, FAD, FAE, FCA, FCD, FCE, FDA, FDC, FDE, FEA, FEC, FED
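For reference, the 72 can be reproduced by brute force in Python (again just a standard-library check, not a closed-form formula), which is how I confirmed the output of my algorithm:

```python
from itertools import permutations

chars = ['A', 'A', 'C', 'D', 'E', 'F']

# permutations() treats the two 'A's as distinct, yielding 6!/(6-3)! = 120
# tuples; the set collapses duplicates, leaving the 72 distinct strings.
distinct = sorted(set(''.join(p) for p in permutations(chars, 3)))
print(len(distinct))  # 72
```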
Does anyone know the formula to arrive at 72 here?