Proof: For any integer $n$ with $n \ge 1$, the number of permutations of a set with $n$ elements is $n!$.
I am trying to use mathematical induction to prove the following theorem:
For any integer $n$ with $n \ge 1$, the number of permutations of a set with $n$ elements is $n!$.
Proof
Let $P(n)$ be the above statement.
Take the set of elements $\{ x_1, x_2, \dots, x_n \mid n \ge 1 \}$.
$P(1)$ holds because the number of permutations of a set with $1$ element is $1$ and $1! = 1$.
Now assume that $P(n)$ is true for some $n = m \ge 1$.
$P(m + 1)$ means that we have the set $\{ x_1, x_2, \dots, x_{m + 1} \}.$
I'm not sure how to proceed from here. I was thinking of using the multiplication rule somehow, but I've been unable to make progress on this.
I've also been unable to find any proofs for this theorem online.
I would greatly appreciate it if people could please help me prove this.
EDIT (Completed Proof):
Let $P(n)$ be the above statement.
Take the set $X = \{ x_1, x_2, \dots, x_n \mid n \ge 1 \}$.
$P(1)$ holds because the number of permutations of a set with $1$ element is $1$ and $1! = 1$.
Now assume that $P(n)$ is true for some $n = m \ge 1$.
And assume we have the sets $X = \{ x_1, x_2, \dots, x_m \}$ and $X' = \{ x_{m + 1} \}$.
Let task $T$ represent tasks $T_1, T_2, \dots, T_m$, where task $T_k$, $k = 1, 2, \dots, m$, represents the task where the $k$th element of the set $X$ is fixed and every permutation of the resultant set configuration is calculated.
Every time we fix one element and find all permutations of the resultant set, that leaves one set configuration that the next set cannot have since it would be identical to one of the permutations of the previous configuration. This is what we assumed to be true for a set with $m$ elements.
Let task $T_{m + 1}$ be the task where we take the set $X^* = X \cup X'$, fix the element $x_{m + 1}$, and calculate all permutations of the resultant set configuration. Since there are $m + 1$ elements in the set $X^*$, there are $m + 1$ ways to fix $x_{m + 1}$ ($m + 1$ set configurations) and calculate all permutations. Therefore, according to the multiplication rule, there are $(m!)(m + 1) = (m + 1)!$ ways to perform tasks $T$ and $T_{m + 1}$. $$\tag*{$\blacksquare$}$$
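Not part of the proof itself, but as a quick sanity check of the claim $P(n)$ for small $n$, a brute-force enumeration in Python (the variable names are just for illustration) agrees with $n!$:

from itertools import permutations
from math import factorial

# Sanity check (not a proof): enumerate the permutations of {x_1, ..., x_n}
# and compare the count with n! for small n.
for n in range(1, 8):
    X = [f"x{i}" for i in range(1, n + 1)]
    count = sum(1 for _ in permutations(X))
    assert count == factorial(n), (n, count)
    print(n, count)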
I would greatly appreciate it if people could please review the complete proof and provide feedback as to its correctness.
combinatorics permutations induction proof-explanation
I think you should change the 4th line to "Now assume that $P(m)$ is true for some $m \geq 1$." (The word some is important because you don't assume for all $m$; also it's greater or equal $1$, not $2$, since you still don't know about $m=2$.)
– Yanko
Dec 28 '18 at 16:21
Careful with your notation: you use $P$ to denote the statement you want to prove, but then you write $P(1)=1!$
– Sambo
Dec 28 '18 at 16:23
@Yanko Thanks for the comment. But we already know that it’s true for $m =1$, don’t we?
– The Pointer
Dec 28 '18 at 16:27
@ThePointer yes that's why we can assume the claim holds for some $m \geq 1$. (in other words you also need to prove the implication $P(1) \implies P(2)$.)
– Yanko
Dec 28 '18 at 16:29
I can also explain @Sambo's comment. I believe he is talking about the second line. Instead of writing $P(1)=1!$ you should say "$P(1)$ holds because the number of permutations of $1$ element is of size $1$ and $1=1!$". His point is that $P(1)$ is a statement, not a number.
– Yanko
Dec 28 '18 at 16:31
7 Answers
Your proof seems correct, but (in my opinion) it's too elaborate. It obscures the combinatorial simplicity of the argument. I might write the following:
Clearly, there is only one (that is, $1!$) permutation on a set containing a single element.
Suppose now (for the purpose of induction) that there are $m!$ permutations of a set containing $m$ elements. We seek to show that there are $(m+1)!$ permutations of a set containing $m+1$ elements.
We can construct all permutations of the latter set as follows:
- Choose a permutation of the first $m$ elements.
- Insert the final element into the permutation.
By the inductive hypothesis, Step 1 can be completed in $m!$ ways. Step 2 can be completed in $m+1$ ways, since there are $m+1$ locations into which the final element may be inserted. Therefore, we have (by the multiplication principle) that the number of permutations of an $m+1$ element set is equal to $m! \cdot (m+1)$, which is the desired $(m+1)!$.
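The two steps can also be illustrated with a minimal Python sketch (the helper name is just illustrative): starting from the single permutation of a one-element set, each round performs Step 1 and Step 2 and checks the resulting count.

from math import factorial

def extend_by_insertion(perms_m, new_element):
    # Step 1: take each permutation of the first m elements (a tuple).
    # Step 2: insert the new element into each of the m + 1 possible positions.
    return [p[:i] + (new_element,) + p[i:]
            for p in perms_m
            for i in range(len(p) + 1)]

perms = [("x1",)]                              # the single permutation of a 1-element set
for m in range(1, 6):
    perms = extend_by_insertion(perms, f"x{m + 1}")
    assert len(perms) == factorial(m + 1)      # m! choices times (m + 1) positions
    assert len(set(perms)) == len(perms)       # no permutation is produced twice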
Thanks for the answer, Austin. I'm having trouble accepting your reasoning as a proof. My problem with your reasoning is the same as that which I have with Siong's (although, it was my perception that Siong was just giving hints rather than a full proof): you conclude that we therefore have $m! (m + 1)$ permutations without justifying this fact (without justifying how you got $m! (m + 1)$), which seems to me to invalidate it being a proof? What my proof does is justify how we get $m! (m + 1)$ (by utilising the multiplication theorem). It is more elaborate, but that's because [...]
– The Pointer
Dec 31 '18 at 10:33
[...] it's more rigorous, right?
– The Pointer
Dec 31 '18 at 10:33
@ThePointer There are many good articles available discussing what constitutes a 'proof' in mathematics (and what level of formalism is appropriate in different situations). When we talk in general about proofs (such as the majority of those on this site), we are talking about a 'semi-formal' proof which may be supported by 'obvious' facts/intuition and well established results, rather than an actual formal proof which could be checked by a computer, say. A proof which is 'more formal' is not necessarily better or more valid. [...]
– John Don
Dec 31 '18 at 16:37
@ThePointer [...] Moreover, if you want to be strict about the level of formalism, then really you should have a set of facts (axioms) that can be assumed (e.g. you are using the rule of product as an established fact). Without this it is hard to know what you would consider "too rigorous" or "not rigorous enough". That is not to say that it is not worth being aware of how pieces of intuition can be formally justified (see this for example).
– John Don
Dec 31 '18 at 16:40
@ThePointer That being said though, accepting that "it can be done formally" is IMO unsatisfying. You should read a proofs textbook and write some fully formal proofs, so that most importantly you can confirm what is "obviously true" is actually true, and that you can accept "it can be done formally (but is tedious!)" more confidently.
– palmpo
Jan 4 at 0:22
When we are given $m+1$ items, we separate a special item from the other $m$ items. We sort the $m$ items and fix their order; that gives us $m!$ options.
Now we have $m+1$ positions to choose from to place our special item, and none of them repeats.
Hence we have $m! \cdot (m+1) = (m+1)!$
Thanks for the answer. I’m confused by this; can you please elaborate?
– The Pointer
Dec 28 '18 at 16:50
Suppose you have figured out that the number of permutations of $5$ items is $5!$. Now, given the $6$th, you sort the $5$ items first; then you have $6$ options to place the $6$th item, right? To visualize, let the configuration of the $5$ items be your fingers: you can see that you have $6$ places to choose from for the new item.
– Siong Thye Goh
Dec 28 '18 at 17:02
Ok, thanks for the clarification. I have edited my post with a complete proof.
– The Pointer
Dec 30 '18 at 5:41
Consider all the $m!$ permutations of the first $m$ elements and insert the $(m+1)^{\text{th}}$ in all the possible places. As there are $m+1$ possible places per permutation, we get a total of $(m+1)m!$.
It remains to show that we didn't forget any new permutation nor repeat any.
- By hypothesis, we started with all permutations of the first $m$; a permutation of the $m+1$ is perforce a permutation of the $m$ with the $(m+1)^{\text{th}}$ inserted somewhere. We tried all insertion points, so nothing is missed.
- All permutations of the $m$ elements are distinct. The permutations obtained by inserting the new element in position $k$ are distinct from each other, and distinct for different $k$. Thus all new permutations are distinct.
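For a concrete check of both points with $m = 5$, here is a minimal Python sketch (the variable names are illustrative):

from itertools import permutations

elements = list(range(1, 6))            # the first m = 5 elements
perms_m = list(permutations(elements))

new = 6                                 # the (m+1)-th element
inserted = [p[:i] + (new,) + p[i:] for p in perms_m for i in range(len(p) + 1)]

# 'nor repeat any': the (m + 1) * m! results are pairwise distinct
assert len(set(inserted)) == len(inserted)
# 'didn't forget any': every permutation of the m + 1 elements is obtained
assert set(inserted) == set(permutations(elements + [new]))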
For the OP's benefit: This is a high-level summary of what I am trying to flesh out in my answer: 'didn't forget any' = surjectivity and 'nor repeat any' = injectivity. (+1)
– CopyPasteIt
Dec 31 '18 at 16:52
Any rigorous proof that involves induction has to develop some rudimentary theory for the set of all permutations on a set. The fact that the result is so well known and can be directly proven using the rule of product does not mean that proof details can be glossed over.
Let $X$ be a finite set containing $n$ elements with symmetric group $\text{Sym}(X)$, i.e. the set of all permutations of $X$. Let $x' \notin X$ and set $X' = X \cup \{x'\}$ so that there is a natural inclusion $\text{Sym}(X) \subset \text{Sym}(X')$ - you extend any permutation $\rho$ on $X$ to all of $X'$ by defining $\rho(x') = x'$.
If we have a set of permutations, we can generate a larger set by taking all possible products (i.e. functional composition) of the permutations. Whenever we generate these 'algebraic closures' we throw in the identity mapping for good measure.
Proposition 1: If $\text{Sym}(X)$ is generated by the set of all transpositions then the set $\text{Sym}(X')$ is also generated by the set of transpositions.
Proof
Let $f \in \text{Sym}(X')$ and set $\tau = \left(\,x' \;\; f^{-1}(x')\,\right)$ and $g = f \circ \tau$. It is immediate that $g(x') = x'$, so $g \in \text{Sym}(X)$ and, by hypothesis, $g$ is a product of transpositions. But then $f = g \circ \tau^{-1}$ is also a composition of transpositions, since $\tau^{-1} = \tau$. $\quad\blacksquare$
Note: In the preceding proof, $\tau$ can represent the identity. So we are abusing the notation used for a transposition, but this is a convenience that we will continue to implicitly employ in what follows.
In general, the following equations hold true for transpositions:
\begin{align}
(T1) \quad (ab)(ab)&=1_{Id}\\
(T2) \quad (ab)(cd)&=(cd)(ab)\\
(T3) \quad (ab)(ac)&=(bc)(ab)\\
(T4) \quad (ab)(bc)&=(bc)(ac)
\end{align}
If $f \in \text{Sym}(X')$ is a product of transpositions, locate the first one $\tau$ from the left side (if any) that swaps $x'$ with another element and start 'pushing it' step-by-step to the far right using
$\text{(T1)}$ thru $\text{(T3)}$. When using $\text{(T2)}$ or $\text{(T3)}$ progress is made as $\tau$ is closer to the right and none of the transpositions on the left of $\tau$ swap $x'$. If at any step $\text{(T1)}$ comes into play, then the 'composition string' is reduced by $2$ transpositions (progress) and you have to start over.
When this pushing algorithm has ended, we can write
$$f = g \circ \left(\,x' \;\; x\,\right), \text{ with } (x \in X) \;\land\; (g \in \text{Sym}(X)). \tag{1}$$
Let $\mathcal S$ be the collection of all transpositions in $\text{Sym}(X')$ swapping $x'$ with another element. Again, as a convenience the identity is a member of $\mathcal S$, and so $\#(\mathcal S) = n + 1$.
Proposition 2: The mapping $\left(g,\tau\right) \mapsto g \circ \tau$ defines a bijective correspondence
$\quad \text{Sym}(X) \times \mathcal S \equiv \text{Sym}(X')$.
Proof
We've already demonstrated that the mapping is a surjection, and the injectivity is an easy argument. $\blacksquare$
An easy consequence of Proposition 2 is that
$\quad \#\left(\text{Sym}(X')\right) = \#\left(\text{Sym}(X) \times \mathcal S\right) = \#\left(\text{Sym}(X)\right) \times \#\left(\mathcal S\right) = \#\left(\text{Sym}(X)\right) \times (n+1)$
The following can be derived using induction and the above theory:
Theorem 3: If $X$ has $n$ elements then $\text{Sym}(X)$ contains $n!$ elements.
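For readers who want to experiment, here is a minimal Python sketch (with ad hoc helper names, representing permutations as dictionaries) that checks the decomposition behind Proposition 2 on a $3$-element set $X$:

from itertools import permutations

X = [1, 2, 3]                  # a 3-element set X; x' is one extra element
x_prime = 4
X_prime = X + [x_prime]

def transposition(a, b, domain):
    # The permutation of `domain` swapping a and b (the identity map when a == b).
    t = {z: z for z in domain}
    t[a], t[b] = b, a
    return t

def compose(f, g):
    # The composition f o g of two permutations given as dictionaries.
    return {z: f[g[z]] for z in g}

sym_X_prime = [dict(zip(X_prime, image)) for image in permutations(X_prime)]

pairs = set()
for f in sym_X_prime:
    f_inv = {v: k for k, v in f.items()}
    tau = transposition(x_prime, f_inv[x_prime], X_prime)    # an element of S
    g = compose(f, tau)                                      # fixes x', so it 'is' a permutation of X
    assert g[x_prime] == x_prime
    pairs.add((tuple(sorted(g.items())), f_inv[x_prime]))

# Every f in Sym(X') determines a distinct pair (g, tau), and there are
# #Sym(X) * (n + 1) = 3! * 4 = 24 of them, matching #Sym(X').
assert len(pairs) == len(sym_X_prime) == 24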
Thanks for the answer. The level of rigour presented in your proof is way beyond what a textbook of this level expects. But I take it that you agree that my proof is valid/correct (albeit, less rigorous than the one you have presented)?
– The Pointer
Dec 31 '18 at 15:35
@ThePointer I have difficulty understanding what a task is. But can you follow the 'gist' of my argument? It may appear formal but the idea is straightforward. Does your textbook present transpositions? I thought it would be helpful to prove that every permutation is a product of transpositions using induction (usually one breaks down the cycles into transpositions).
– CopyPasteIt
Dec 31 '18 at 15:42
A "task" is as presented in the multiplication theorem: If there are $n_k$ ways to perform task $T_k$ for $k=1,dots,m$, then there are $n_1n_2dots n_m$ ways to perform all $m$ tasks.
– The Pointer
Dec 31 '18 at 15:43
And this is a discrete mathematics textbook (often used by computer scientists), so it doesn't introduce transpositions. It's a pretty basic-level textbook -- certainly not rigorous like textbooks for mathematicians are.
– The Pointer
Dec 31 '18 at 15:44
Yes, the textbook requests proof by induction and said it will "leave the proof to the reader as an exercise", which is why I decided to do it. It actually presented the product rule in the same chapter (the chapter on combinatorics), immediately before this theorem, so I thought it would perhaps be natural to use it in the proof.
– The Pointer
Dec 31 '18 at 15:52
This answer might be more appropriate for students studying discrete mathematics and interested in combinatorics and/or computer science.
One can regard permutations as 'scrambled words';
see Permutation: One line notation.
So with the ordered alphabet $\mathcal A = \{a,b,c,\dots,z\}$ here are the lists of scrambled permutation words:
Scrambling first 1 letters: $P_1 = \{a\}$
Scrambling first 2 letters: $P_2 = \{ab,ba\}$
Scrambling first 3 letters: $P_3 = \{abc,acb,bac,cab,cba,bca\}$
etc.
These ideas can be implemented in Python; here we create $P_3$:
>>> import itertools
>>> list(itertools.permutations(['a','b','c'],3))
[('a', 'b', 'c'), ('a', 'c', 'b'), ('b', 'a', 'c'), ('b', 'c', 'a'), ('c', 'a', 'b'), ('c', 'b', 'a')]
How many scrambled permutation words can be formed with the first $4$ letters, $\{a,b,c,d\}$?
There is a mapping from $P_4$ to $P_3$ defined by removing the letter $d$ from a word $\pmb\omega$ in $P_4$.
Examples: $abdc \mapsto abc$; $\;cdba \mapsto cba$.
It is easy to see that $4$ words in $P_4$ get mapped to each word in $P_3$.
So the number of words in $P_4$ is equal to $4 \times \#(P_3)$.
In general it can be shown that
$$\#(P_{n+1}) = (n+1) \times \#(P_n) \tag{1}$$
The above is all that is needed to get the $n!$ permutation count using induction.
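Continuing the itertools example above, here is a short check of $(1)$ for $n = 3$ via the remove-$d$ map from $P_4$ onto $P_3$ (the helper name is ad hoc):

import itertools

def scrambled_words(letters):
    # All permutation 'words' over the given letters, as strings.
    return {''.join(p) for p in itertools.permutations(letters)}

P3 = scrambled_words('abc')
P4 = scrambled_words('abcd')

# Deleting the letter d maps P4 onto P3, with exactly 4 words of P4 above each word of P3 ...
fibers = {w: [v for v in P4 if v.replace('d', '') == w] for w in P3}
assert all(len(f) == 4 for f in fibers.values())

# ... so #(P4) = 4 * #(P3), which is equation (1) with n = 3.
assert len(P4) == 4 * len(P3)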
Here's the basis of a proof I might do for this. I'm going to use a numerical argument to make my point clearer, but it will hold for a general $n$ too.
Suppose we assume that the statement holds true for $n=4$ and we use the letters $A,B,C,D$ as our four elements in the set.
One such listing might be $ABCD$.
Suppose then we wish to add a fifth element, $E$, to this set, whilst keeping the other four in order. We can do this in five ways, clearly:
$$EABCD, AEBCD, ABECD, ABCED, ABCDE$$
This will be the case for every permutation of the set of $4$, and will cover every permutation of the set of $5$, without crossovers (the second condition eliminates them).
Thus, the number of permutations of a set of $5$ is $4!\cdot 5 = 5!$.
Generalise this and you have a proof.
If $X$ is a set, then $S_X$ shall denote the set of all permutations of $X$ (that is, of all bijections from $X$ to $X$). You want to prove that every finite set $X$ satisfies $\left|S_X\right| = \left|X\right|!$. (Note that this is perfectly true when $X$ is empty.)
If you want a fully rigorous proof written up in all detail, then one place you can find it is Corollary 7.82 in my Notes on the combinatorial fundamentals of algebra, 10th of January 2019. This proof can be summarized as follows:
If $X$ is a set, and if $x$ and $y$ are two elements of $X$, then we let $t_{x, y}$ be the permutation of $X$ that swaps $x$ with $y$ while leaving all other elements of $X$ unchanged. (This is the identity map when $x = y$, and otherwise is a transposition.)
Now let $n$ be a nonnegative integer, and set $\left[n\right] := \left\{1,2,\ldots,n\right\}$. Prove (Exercise 5.9 in op. cit.) that each permutation $\sigma$ of $\left[n\right]$ can be written in the form $\sigma = t_{1, i_1} \circ t_{2, i_2} \circ \cdots \circ t_{n, i_n}$ for a unique $\left(i_1, i_2, \ldots, i_n\right) \in \left[1\right] \times \left[2\right] \times \cdots \times \left[n\right]$. Roughly speaking, this simply means that $\sigma$ can be turned into the identity permutation by first swapping $n$ with its image, then swapping $n-1$ with its image (under the resulting permutation), and so on, until all numbers $n, n-1, \ldots, 1$ are in their proper places. I formalize this as an induction argument.
The above statement can be interpreted as a bijection between the set $\left[1\right] \times \left[2\right] \times \cdots \times \left[n\right]$ and the set $S_{\left[n\right]}$ of all permutations of $\left[n\right]$. The former set clearly has size $n!$; hence, so does the latter. Thus, the set $S_X$ of all permutations of any $n$-element set $X$ has size $n!$ as well (to prove this, fix some bijection between $X$ and $\left[n\right]$, and use it to construct a bijection from $S_X$ to $S_{\left[n\right]}$).
This is not the shortest proof. The post by @CopyPasteIt suggests a different argument, but muddles it up somewhat in my opinion, so here is a sketch of how I would do it:
Proceed by induction on $\left|X\right|$. The induction base ($\left|X\right| = 0$) is obvious. For the induction step, you need to show that if $X$ is a finite set and $x \in X$ is arbitrary, then $\left|S_X\right| = \left|X\right| \cdot \left|S_{X \setminus \left\{x\right\}}\right|$.
So let us fix a finite set $X$ and some $x \in X$. Let $W$ be the set of all permutations $\tau \in S_X$ satisfying $\tau\left(x\right) = x$. It is not hard to see that there is a bijection from $S_{X \setminus \left\{x\right\}}$ to $W$. (Indeed, this follows from Proposition 5.58 (e) in op. cit., applied to $Y = X \setminus \left\{x\right\}$; but you really should prove this yourself, as it is a straightforward formalization of a clear statement: A permutation of $X$ that fixes $x$ is "really" just a permutation of $X \setminus \left\{x\right\}$. When you formalize the "really", you get a very natural bijection.) Hence, $\left|W\right| = \left|S_{X \setminus \left\{x\right\}}\right|$.
If $\sigma \in S_X$ is any permutation, then there is a unique $y \in X$ satisfying $t_{x, y} \circ \sigma \in W$ (namely, this $y$ is $\sigma\left(x\right)$). Hence, each $\sigma \in S_X$ can be written as $t_{x, y} \circ w$ for a unique pair $\left(y, w\right) \in X \times W$. In other words, the map $X \times W \to S_X$, $\left(y, w\right) \mapsto t_{x, y} \circ w$ is a bijection. Thus, $\left|S_X\right| = \left|X \times W\right| = \left|X\right| \cdot \left|W\right| = \left|X\right| \cdot \left|S_{X \setminus \left\{x\right\}}\right|$ (since $\left|W\right| = \left|S_{X \setminus \left\{x\right\}}\right|$). This completes the induction step.
Things get somewhat easier once you get used to combinatorics and realize that the permutations of a finite $n$-element set $X$ are in bijection with the $n$-tuples $\left(x_1, x_2, \ldots, x_n\right)$ of distinct elements of $X$. Indeed, confusingly, the latter $n$-tuples are often called permutations of $X$ as well, although there is in general no canonical bijection between these two kinds of permutations (see Remark 5.4 in op. cit.). This confusion set aside, it is not hard to see using some versions of the product law that the number of $n$-tuples $\left(x_1, x_2, \ldots, x_n\right)$ of distinct elements of an $n$-element set $X$ is $n!$; in fact, more generally, for any nonnegative integer $m$, the number of $m$-tuples $\left(x_1, x_2, \ldots, x_m\right)$ of distinct elements of an $n$-element set $X$ is $n\left(n-1\right)\left(n-2\right)\cdots\left(n-m+1\right)$. This gives perhaps the shortest proofs of your claim, having the advantage that you don't really have to deal with permutations (once you have shown that they are in bijection with these tuples).
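As a concrete illustration of that last count (a small ad hoc sketch, not part of the cited notes), one can compare the number of $m$-tuples of distinct elements of an $n$-element set with $n\left(n-1\right)\cdots\left(n-m+1\right)$:

from itertools import permutations
from math import factorial

n = 6
X = list(range(n))

for m in range(n + 1):
    # m-tuples of distinct elements of X
    count = sum(1 for _ in permutations(X, m))
    falling_factorial = factorial(n) // factorial(n - m)    # n(n-1)...(n-m+1)
    assert count == falling_factorial

# For m = n these tuples are exactly the 'one-line' permutations, so there are n! of them.
assert sum(1 for _ in permutations(X, n)) == factorial(n)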
$endgroup$
add a comment |
Your Answer
StackExchange.ready(function() {
var channelOptions = {
tags: "".split(" "),
id: "69"
};
initTagRenderer("".split(" "), "".split(" "), channelOptions);
StackExchange.using("externalEditor", function() {
// Have to fire editor after snippets, if snippets enabled
if (StackExchange.settings.snippets.snippetsEnabled) {
StackExchange.using("snippets", function() {
createEditor();
});
}
else {
createEditor();
}
});
function createEditor() {
StackExchange.prepareEditor({
heartbeatType: 'answer',
autoActivateHeartbeat: false,
convertImagesToLinks: true,
noModals: true,
showLowRepImageUploadWarning: true,
reputationToPostImages: 10,
bindNavPrevention: true,
postfix: "",
imageUploader: {
brandingHtml: "Powered by u003ca class="icon-imgur-white" href="https://imgur.com/"u003eu003c/au003e",
contentPolicyHtml: "User contributions licensed under u003ca href="https://creativecommons.org/licenses/by-sa/3.0/"u003ecc by-sa 3.0 with attribution requiredu003c/au003e u003ca href="https://stackoverflow.com/legal/content-policy"u003e(content policy)u003c/au003e",
allowUrls: true
},
noCode: true, onDemand: true,
discardSelector: ".discard-answer"
,immediatelyShowMarkdownHelp:true
});
}
});
Sign up or log in
StackExchange.ready(function () {
StackExchange.helpers.onClickDraftSave('#login-link');
});
Sign up using Google
Sign up using Facebook
Sign up using Email and Password
Post as a guest
Required, but never shown
StackExchange.ready(
function () {
StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fmath.stackexchange.com%2fquestions%2f3055033%2fproof-for-any-integer-n-with-n-ge-1-the-number-of-permutations-of-a-set-w%23new-answer', 'question_page');
}
);
Post as a guest
Required, but never shown
7 Answers
7
active
oldest
votes
7 Answers
7
active
oldest
votes
active
oldest
votes
active
oldest
votes
$begingroup$
Your proof seems correct, but (in my opinion) it's too elaborate. It obscures the combinatorial simplicity of the argument. I might write the following:
Clearly, there is only one (that is, $1!$) permutation on a set containing a single element.
Suppose now (for the purpose of induction) that there are $m!$ permutations of a set containing $m$ elements. We seek to show that there are $(m+1)!$ permutations of a set containing $m+1$ elements.
We can construct all permutations of the latter set as follows:
- Choose a permutation of the first $m$ elements.
- Insert the final element into the permutation.
By the inductive hypothesis, Step 1 can be completed in $m!$ ways. Step 2 can be completed in $m+1$ ways, since there are $m+1$ locations into which the final element may be inserted. Therefore, we have (by the multiplication principle) that the number of permutations of an $m+1$ element set is equal to $m! cdot (m+1)$, which is the desired $(m+1)!$.
$endgroup$
$begingroup$
Thanks for the answer, Austin. I'm having trouble accepting your reasoning as a proof. My problem with your reasoning is the same as that which I have with Siong's (although, it was my perception that Siong was just giving hints rather than a full proof): you conclude that we therefore have $m! (m + 1)$ permutations without justifying this fact (without justifying how you got $m! (m + 1)$), which seems to me to invalidate it being a proof? What my proof does is justify how we get $m! (m + 1)$ (by utilising the multiplication theorem). It is more elaborate, but that's because [...]
$endgroup$
– The Pointer
Dec 31 '18 at 10:33
$begingroup$
[...] it's more rigorous, right?
$endgroup$
– The Pointer
Dec 31 '18 at 10:33
2
$begingroup$
@ThePointer There are many good articles available discussing what constitutes a 'proof' in mathematics (and what level of formalism is appropriate in different situations). When we talk in general about proofs (such as the majority of those on this site), we are talking about a 'semi-formal' proof which is may be supported by 'obvious' facts/intuition and well established results rather than an actual formal proof which could be checked by a computer, say. A proof which is 'more formal' is not necessarily better or more valid. [...]
$endgroup$
– John Don
Dec 31 '18 at 16:37
1
$begingroup$
@ThePointer [...] Moreover, if you want to be strict about the level of formalism, then really you should have a set of facts (axioms) that can be assumed (e.g. you are using the rule of product) as an established fact. Without this it hard to know what you would consider "too rigorous" or "not rigorous enough". That is not to say that it is not worth being aware of how pieces of intuition can be formally justified (see this for example).
$endgroup$
– John Don
Dec 31 '18 at 16:40
1
$begingroup$
@ThePointer That being said though, accepting that "it can be done formally" is IMO unsatisfying. You should read a proofs textbook and write some fully formal proofs, so that most importantly you can confirm what is "obviously true" is actually true, and that you can accept "it can be done formally (but is tedious!)" more confidently.
$endgroup$
– palmpo
Jan 4 at 0:22
|
show 4 more comments
$begingroup$
Your proof seems correct, but (in my opinion) it's too elaborate. It obscures the combinatorial simplicity of the argument. I might write the following:
Clearly, there is only one (that is, $1!$) permutation on a set containing a single element.
Suppose now (for the purpose of induction) that there are $m!$ permutations of a set containing $m$ elements. We seek to show that there are $(m+1)!$ permutations of a set containing $m+1$ elements.
We can construct all permutations of the latter set as follows:
- Choose a permutation of the first $m$ elements.
- Insert the final element into the permutation.
By the inductive hypothesis, Step 1 can be completed in $m!$ ways. Step 2 can be completed in $m+1$ ways, since there are $m+1$ locations into which the final element may be inserted. Therefore, we have (by the multiplication principle) that the number of permutations of an $m+1$ element set is equal to $m! cdot (m+1)$, which is the desired $(m+1)!$.
$endgroup$
$begingroup$
Thanks for the answer, Austin. I'm having trouble accepting your reasoning as a proof. My problem with your reasoning is the same as that which I have with Siong's (although, it was my perception that Siong was just giving hints rather than a full proof): you conclude that we therefore have $m! (m + 1)$ permutations without justifying this fact (without justifying how you got $m! (m + 1)$), which seems to me to invalidate it being a proof? What my proof does is justify how we get $m! (m + 1)$ (by utilising the multiplication theorem). It is more elaborate, but that's because [...]
$endgroup$
– The Pointer
Dec 31 '18 at 10:33
$begingroup$
[...] it's more rigorous, right?
$endgroup$
– The Pointer
Dec 31 '18 at 10:33
2
$begingroup$
@ThePointer There are many good articles available discussing what constitutes a 'proof' in mathematics (and what level of formalism is appropriate in different situations). When we talk in general about proofs (such as the majority of those on this site), we are talking about a 'semi-formal' proof which is may be supported by 'obvious' facts/intuition and well established results rather than an actual formal proof which could be checked by a computer, say. A proof which is 'more formal' is not necessarily better or more valid. [...]
$endgroup$
– John Don
Dec 31 '18 at 16:37
1
$begingroup$
@ThePointer [...] Moreover, if you want to be strict about the level of formalism, then really you should have a set of facts (axioms) that can be assumed (e.g. you are using the rule of product) as an established fact. Without this it hard to know what you would consider "too rigorous" or "not rigorous enough". That is not to say that it is not worth being aware of how pieces of intuition can be formally justified (see this for example).
$endgroup$
– John Don
Dec 31 '18 at 16:40
1
$begingroup$
@ThePointer That being said though, accepting that "it can be done formally" is IMO unsatisfying. You should read a proofs textbook and write some fully formal proofs, so that most importantly you can confirm what is "obviously true" is actually true, and that you can accept "it can be done formally (but is tedious!)" more confidently.
$endgroup$
– palmpo
Jan 4 at 0:22
|
show 4 more comments
$begingroup$
Your proof seems correct, but (in my opinion) it's too elaborate. It obscures the combinatorial simplicity of the argument. I might write the following:
Clearly, there is only one (that is, $1!$) permutation on a set containing a single element.
Suppose now (for the purpose of induction) that there are $m!$ permutations of a set containing $m$ elements. We seek to show that there are $(m+1)!$ permutations of a set containing $m+1$ elements.
We can construct all permutations of the latter set as follows:
- Choose a permutation of the first $m$ elements.
- Insert the final element into the permutation.
By the inductive hypothesis, Step 1 can be completed in $m!$ ways. Step 2 can be completed in $m+1$ ways, since there are $m+1$ locations into which the final element may be inserted. Therefore, we have (by the multiplication principle) that the number of permutations of an $m+1$ element set is equal to $m! cdot (m+1)$, which is the desired $(m+1)!$.
$endgroup$
Your proof seems correct, but (in my opinion) it's too elaborate. It obscures the combinatorial simplicity of the argument. I might write the following:
Clearly, there is only one (that is, $1!$) permutation on a set containing a single element.
Suppose now (for the purpose of induction) that there are $m!$ permutations of a set containing $m$ elements. We seek to show that there are $(m+1)!$ permutations of a set containing $m+1$ elements.
We can construct all permutations of the latter set as follows:
- Choose a permutation of the first $m$ elements.
- Insert the final element into the permutation.
By the inductive hypothesis, Step 1 can be completed in $m!$ ways. Step 2 can be completed in $m+1$ ways, since there are $m+1$ locations into which the final element may be inserted. Therefore, we have (by the multiplication principle) that the number of permutations of an $m+1$ element set is equal to $m! cdot (m+1)$, which is the desired $(m+1)!$.
edited Jan 3 at 19:23
answered Dec 30 '18 at 21:40
Austin MohrAustin Mohr
20.9k35299
20.9k35299
$begingroup$
Thanks for the answer, Austin. I'm having trouble accepting your reasoning as a proof. My problem with your reasoning is the same as that which I have with Siong's (although, it was my perception that Siong was just giving hints rather than a full proof): you conclude that we therefore have $m! (m + 1)$ permutations without justifying this fact (without justifying how you got $m! (m + 1)$), which seems to me to invalidate it being a proof? What my proof does is justify how we get $m! (m + 1)$ (by utilising the multiplication theorem). It is more elaborate, but that's because [...]
$endgroup$
– The Pointer
Dec 31 '18 at 10:33
$begingroup$
[...] it's more rigorous, right?
$endgroup$
– The Pointer
Dec 31 '18 at 10:33
2
$begingroup$
@ThePointer There are many good articles available discussing what constitutes a 'proof' in mathematics (and what level of formalism is appropriate in different situations). When we talk in general about proofs (such as the majority of those on this site), we are talking about a 'semi-formal' proof which is may be supported by 'obvious' facts/intuition and well established results rather than an actual formal proof which could be checked by a computer, say. A proof which is 'more formal' is not necessarily better or more valid. [...]
$endgroup$
– John Don
Dec 31 '18 at 16:37
1
$begingroup$
@ThePointer [...] Moreover, if you want to be strict about the level of formalism, then really you should have a set of facts (axioms) that can be assumed (e.g. you are using the rule of product) as an established fact. Without this it hard to know what you would consider "too rigorous" or "not rigorous enough". That is not to say that it is not worth being aware of how pieces of intuition can be formally justified (see this for example).
$endgroup$
– John Don
Dec 31 '18 at 16:40
1
$begingroup$
@ThePointer That being said though, accepting that "it can be done formally" is IMO unsatisfying. You should read a proofs textbook and write some fully formal proofs, so that most importantly you can confirm what is "obviously true" is actually true, and that you can accept "it can be done formally (but is tedious!)" more confidently.
$endgroup$
– palmpo
Jan 4 at 0:22
|
show 4 more comments
$begingroup$
Thanks for the answer, Austin. I'm having trouble accepting your reasoning as a proof. My problem with your reasoning is the same as that which I have with Siong's (although, it was my perception that Siong was just giving hints rather than a full proof): you conclude that we therefore have $m! (m + 1)$ permutations without justifying this fact (without justifying how you got $m! (m + 1)$), which seems to me to invalidate it being a proof? What my proof does is justify how we get $m! (m + 1)$ (by utilising the multiplication theorem). It is more elaborate, but that's because [...]
$endgroup$
– The Pointer
Dec 31 '18 at 10:33
$begingroup$
[...] it's more rigorous, right?
$endgroup$
– The Pointer
Dec 31 '18 at 10:33
2
$begingroup$
@ThePointer There are many good articles available discussing what constitutes a 'proof' in mathematics (and what level of formalism is appropriate in different situations). When we talk in general about proofs (such as the majority of those on this site), we are talking about a 'semi-formal' proof which is may be supported by 'obvious' facts/intuition and well established results rather than an actual formal proof which could be checked by a computer, say. A proof which is 'more formal' is not necessarily better or more valid. [...]
$endgroup$
– John Don
Dec 31 '18 at 16:37
1
$begingroup$
@ThePointer [...] Moreover, if you want to be strict about the level of formalism, then really you should have a set of facts (axioms) that can be assumed (e.g. you are using the rule of product) as an established fact. Without this it hard to know what you would consider "too rigorous" or "not rigorous enough". That is not to say that it is not worth being aware of how pieces of intuition can be formally justified (see this for example).
$endgroup$
– John Don
Dec 31 '18 at 16:40
1
$begingroup$
@ThePointer That being said though, accepting that "it can be done formally" is IMO unsatisfying. You should read a proofs textbook and write some fully formal proofs, so that most importantly you can confirm what is "obviously true" is actually true, and that you can accept "it can be done formally (but is tedious!)" more confidently.
$endgroup$
– palmpo
Jan 4 at 0:22
$begingroup$
Thanks for the answer, Austin. I'm having trouble accepting your reasoning as a proof. My problem with your reasoning is the same as that which I have with Siong's (although, it was my perception that Siong was just giving hints rather than a full proof): you conclude that we therefore have $m! (m + 1)$ permutations without justifying this fact (without justifying how you got $m! (m + 1)$), which seems to me to invalidate it being a proof? What my proof does is justify how we get $m! (m + 1)$ (by utilising the multiplication theorem). It is more elaborate, but that's because [...]
$endgroup$
– The Pointer
Dec 31 '18 at 10:33
$begingroup$
Thanks for the answer, Austin. I'm having trouble accepting your reasoning as a proof. My problem with your reasoning is the same as that which I have with Siong's (although, it was my perception that Siong was just giving hints rather than a full proof): you conclude that we therefore have $m! (m + 1)$ permutations without justifying this fact (without justifying how you got $m! (m + 1)$), which seems to me to invalidate it being a proof? What my proof does is justify how we get $m! (m + 1)$ (by utilising the multiplication theorem). It is more elaborate, but that's because [...]
$endgroup$
– The Pointer
Dec 31 '18 at 10:33
$begingroup$
[...] it's more rigorous, right?
$endgroup$
– The Pointer
Dec 31 '18 at 10:33
$begingroup$
[...] it's more rigorous, right?
$endgroup$
– The Pointer
Dec 31 '18 at 10:33
2
2
$begingroup$
@ThePointer There are many good articles available discussing what constitutes a 'proof' in mathematics (and what level of formalism is appropriate in different situations). When we talk in general about proofs (such as the majority of those on this site), we are talking about a 'semi-formal' proof which is may be supported by 'obvious' facts/intuition and well established results rather than an actual formal proof which could be checked by a computer, say. A proof which is 'more formal' is not necessarily better or more valid. [...]
$endgroup$
– John Don
Dec 31 '18 at 16:37
$begingroup$
@ThePointer There are many good articles available discussing what constitutes a 'proof' in mathematics (and what level of formalism is appropriate in different situations). When we talk in general about proofs (such as the majority of those on this site), we are talking about a 'semi-formal' proof which is may be supported by 'obvious' facts/intuition and well established results rather than an actual formal proof which could be checked by a computer, say. A proof which is 'more formal' is not necessarily better or more valid. [...]
$endgroup$
– John Don
Dec 31 '18 at 16:37
1
1
$begingroup$
@ThePointer [...] Moreover, if you want to be strict about the level of formalism, then really you should have a set of facts (axioms) that can be assumed (e.g. you are using the rule of product) as an established fact. Without this it hard to know what you would consider "too rigorous" or "not rigorous enough". That is not to say that it is not worth being aware of how pieces of intuition can be formally justified (see this for example).
$endgroup$
– John Don
Dec 31 '18 at 16:40
$begingroup$
@ThePointer [...] Moreover, if you want to be strict about the level of formalism, then really you should have a set of facts (axioms) that can be assumed (e.g. you are using the rule of product) as an established fact. Without this it hard to know what you would consider "too rigorous" or "not rigorous enough". That is not to say that it is not worth being aware of how pieces of intuition can be formally justified (see this for example).
$endgroup$
– John Don
Dec 31 '18 at 16:40
1
1
$begingroup$
@ThePointer That being said though, accepting that "it can be done formally" is IMO unsatisfying. You should read a proofs textbook and write some fully formal proofs, so that most importantly you can confirm what is "obviously true" is actually true, and that you can accept "it can be done formally (but is tedious!)" more confidently.
$endgroup$
– palmpo
Jan 4 at 0:22
$begingroup$
@ThePointer That being said though, accepting that "it can be done formally" is IMO unsatisfying. You should read a proofs textbook and write some fully formal proofs, so that most importantly you can confirm what is "obviously true" is actually true, and that you can accept "it can be done formally (but is tedious!)" more confidently.
$endgroup$
– palmpo
Jan 4 at 0:22
|
show 4 more comments
$begingroup$
When we are given $m+1$ items, we separate a special item from the other $m$ items. We sort the $m$ items and fix their order, that give us $m!$ options.
Now we have $m+1$ positions to choose to place our special items and none of them repeats.
Hence we have $m! cdot (m+1)= (m+1)!$
$endgroup$
$begingroup$
Thanks for the answer. I’m confused by this; can you please elaborate?
$endgroup$
– The Pointer
Dec 28 '18 at 16:50
$begingroup$
Suppose you have figure out the number of permutations for $5$ items is $5!$. Now, given the $6$th, you sort the $5$ items first, now you have $6$ options to place the $6$ items right? To visulize, let the configuration of the $5$ items be your fingers, you can see that you have $6$ places to choose for the new item.
$endgroup$
– Siong Thye Goh
Dec 28 '18 at 17:02
$begingroup$
Ok, thanks for the clarification. I have edited my post with a complete proof.
$endgroup$
– The Pointer
Dec 30 '18 at 5:41
add a comment |
$begingroup$
When we are given $m+1$ items, we separate a special item from the other $m$ items. We sort the $m$ items and fix their order, that give us $m!$ options.
Now we have $m+1$ positions to choose to place our special items and none of them repeats.
Hence we have $m! cdot (m+1)= (m+1)!$
$endgroup$
$begingroup$
Thanks for the answer. I’m confused by this; can you please elaborate?
$endgroup$
– The Pointer
Dec 28 '18 at 16:50
$begingroup$
Suppose you have figure out the number of permutations for $5$ items is $5!$. Now, given the $6$th, you sort the $5$ items first, now you have $6$ options to place the $6$ items right? To visulize, let the configuration of the $5$ items be your fingers, you can see that you have $6$ places to choose for the new item.
$endgroup$
– Siong Thye Goh
Dec 28 '18 at 17:02
$begingroup$
Ok, thanks for the clarification. I have edited my post with a complete proof.
$endgroup$
– The Pointer
Dec 30 '18 at 5:41
add a comment |
$begingroup$
When we are given $m+1$ items, we separate a special item from the other $m$ items. We sort the $m$ items and fix their order, that give us $m!$ options.
Now we have $m+1$ positions to choose to place our special items and none of them repeats.
Hence we have $m! cdot (m+1)= (m+1)!$
$endgroup$
When we are given $m+1$ items, we separate a special item from the other $m$ items. We sort the $m$ items and fix their order, that give us $m!$ options.
Now we have $m+1$ positions to choose to place our special items and none of them repeats.
Hence we have $m! cdot (m+1)= (m+1)!$
answered Dec 28 '18 at 16:22
Siong Thye GohSiong Thye Goh
104k1468120
104k1468120
$begingroup$
Thanks for the answer. I’m confused by this; can you please elaborate?
$endgroup$
– The Pointer
Dec 28 '18 at 16:50
$begingroup$
Suppose you have figure out the number of permutations for $5$ items is $5!$. Now, given the $6$th, you sort the $5$ items first, now you have $6$ options to place the $6$ items right? To visulize, let the configuration of the $5$ items be your fingers, you can see that you have $6$ places to choose for the new item.
$endgroup$
– Siong Thye Goh
Dec 28 '18 at 17:02
$begingroup$
Ok, thanks for the clarification. I have edited my post with a complete proof.
$endgroup$
– The Pointer
Dec 30 '18 at 5:41
add a comment |
$begingroup$
Thanks for the answer. I’m confused by this; can you please elaborate?
$endgroup$
– The Pointer
Dec 28 '18 at 16:50
$begingroup$
Suppose you have figure out the number of permutations for $5$ items is $5!$. Now, given the $6$th, you sort the $5$ items first, now you have $6$ options to place the $6$ items right? To visulize, let the configuration of the $5$ items be your fingers, you can see that you have $6$ places to choose for the new item.
$endgroup$
– Siong Thye Goh
Dec 28 '18 at 17:02
$begingroup$
Ok, thanks for the clarification. I have edited my post with a complete proof.
$endgroup$
– The Pointer
Dec 30 '18 at 5:41
$begingroup$
Thanks for the answer. I’m confused by this; can you please elaborate?
$endgroup$
– The Pointer
Dec 28 '18 at 16:50
$begingroup$
Thanks for the answer. I’m confused by this; can you please elaborate?
$endgroup$
– The Pointer
Dec 28 '18 at 16:50
$begingroup$
Suppose you have figure out the number of permutations for $5$ items is $5!$. Now, given the $6$th, you sort the $5$ items first, now you have $6$ options to place the $6$ items right? To visulize, let the configuration of the $5$ items be your fingers, you can see that you have $6$ places to choose for the new item.
$endgroup$
– Siong Thye Goh
Dec 28 '18 at 17:02
$begingroup$
Suppose you have figure out the number of permutations for $5$ items is $5!$. Now, given the $6$th, you sort the $5$ items first, now you have $6$ options to place the $6$ items right? To visulize, let the configuration of the $5$ items be your fingers, you can see that you have $6$ places to choose for the new item.
$endgroup$
– Siong Thye Goh
Dec 28 '18 at 17:02
$begingroup$
Ok, thanks for the clarification. I have edited my post with a complete proof.
$endgroup$
– The Pointer
Dec 30 '18 at 5:41
$begingroup$
Ok, thanks for the clarification. I have edited my post with a complete proof.
$endgroup$
– The Pointer
Dec 30 '18 at 5:41
add a comment |
$begingroup$
Consider all the $m!$ permutations of the first $m$ elements and insert the $(m+1)$th element in every possible place. As there are $m+1$ possible places per permutation, we get a total of $(m+1)m!$.
It remains to show that we neither missed any new permutation nor repeated any.
By hypothesis, we started with all permutations of the first $m$ elements; a permutation of the $m+1$ elements is necessarily a permutation of the first $m$ with the $(m+1)$th element inserted somewhere. Since we tried all insertion points, nothing is missed.
All permutations of the first $m$ elements are distinct. The permutations obtained by inserting the new element in position $k$ are distinct from each other, and distinct across different values of $k$. Thus all the new permutations are distinct.
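A small Python sketch of this check for the illustrative case $m = 4$ (the element names are arbitrary): it builds every insertion and verifies both that nothing is repeated and that nothing is missed.
import itertools

m = 4
base = list(itertools.permutations(range(m)))     # all m! permutations, by hypothesis
new = m                                           # the (m+1)th element

built = [p[:i] + (new,) + p[i:] for p in base for i in range(m + 1)]

# "nor repeat any": the (m+1) * m! insertions are pairwise distinct
assert len(built) == len(set(built)) == (m + 1) * len(base)
# "didn't forget any": together they are exactly the permutations of m+1 elements
assert set(built) == set(itertools.permutations(range(m + 1)))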
$endgroup$
answered Dec 31 '18 at 16:25
Yves Daoust
$begingroup$
For the OP's benefit: This is a high-level summary of what I am trying to flesh out in my answer: 'didn't forget any' = surjectivity and 'nor repeat any' = injectivity. (+1)
$endgroup$
– CopyPasteIt
Dec 31 '18 at 16:52
$begingroup$
Any rigorous proof that involves induction has to develop some rudimentary theory for the set of all permutations on a set. The fact that the result is so well known and can be directly proven using the rule of product does not mean that proof details can be glossed over.
Let $X$ be a finite set containing $n$ elements with symmetric group $text{Sym}(X)$, i.e. the set of all permutations of $X$. Let $x' notin X$ and set $X' = X cup {x'}$ so that there is a natural inclusion $text{Sym}(X) subset text{Sym}(X')$ - you extend any permutation $rho$ on $X$ to all of $X'$ by defining $rho(x') = x'$.
If we have a set of permutations, we can generate a larger set by taking all possible products (i.e. functional composition) of the permutations. Whenever we generate these 'algebraic closures' we throw in the identity mapping for good measure.
Proposition 1: If $text{Sym}(X)$ is generated by the set of all transpositions then the set $text{Sym}(X')$ is also generated by the set of transpositions.
Proof
Let $f in text{Sym}(X')$ and set $tau = left(,x' ;; f^{-1}(x'),right)$ and $g = f circ tau$. It is immediate that $g(x') = x'$, so $g$ restricts to a permutation of $X$ and is therefore, by hypothesis, a product of transpositions. But then $f = g circ tau^{-1}$ is also a composition of transpositions, since $tau^{-1} = tau$. $quad blacksquare$
Note: In the preceding proof, $tau$ can represent the identity. So we are abusing the notation used for a transposition, but this is convenience that we will continue to implicitly employ in what follows.
In general, the following equations hold true for transpositions:
begin{align}(T1) quad (ab)(ab)&=1_{Id}\
(T2) quad (ab)(cd)&=(cd)(ab)\
(T3) quad (ab)(ac)&=(bc)(ab)\
(T4) quad (ab)(bc)&=(bc)(ac)end{align}
If $f in text{Sym}(X')$ is a product of transpositions, locate the first one $tau$ from the left (if any) that swaps $x'$ with another element and start 'pushing' it step by step to the far right using $text{(T1)}$ through $text{(T4)}$. When using $text{(T2)}$, $text{(T3)}$, or $text{(T4)}$, progress is made because $tau$ moves closer to the right and none of the transpositions to the left of $tau$ swap $x'$. If at any step $text{(T1)}$ comes into play, then the 'composition string' is reduced by $2$ transpositions (also progress) and you have to start over.
When this pushing algorithm has ended, we can write
$tag 1 f =g circ left(,x' ;; x,right) , text{ with } (x in X) ;land; (g in text{Sym}(X))$.
Let $mathcal S$ be the collection of all transpositions in $text{Sym}(X')$ swapping $x'$ with another element. Again, as a convenience the identity is a member of $mathcal S$ and so $text{#}(mathcal S) = n + 1$.
Proposition 2: The mapping $left(g,tauright) mapsto g circ tau$ defines a bijective correspondence
$quad text{Sym}(X) times mathcal S equiv text{Sym}(X')$.
Proof
We've already demonstrated that the mapping is a surjection and the injectivity is an easy argument. $blacksquare$
An easy consequence of Proposition 2 is that
$quad text{#}left(text{Sym}(X')right) =text{#} left( text{Sym}(X) times mathcal S right)= text{#} left( text{Sym}(X) right) times text{#} left(mathcal S right) = text{#} left( text{Sym}(X) right) times (n+1) $
The following can be derived using induction and the above theory:
Theorem 3: If $X$ has $n$ elements then $text{Sym}(X)$ contains $n!$ elements.
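Here is a brute-force Python sketch of Proposition 2 for one illustrative case ($n = 4$, with $x' = 4$); it is only a check of the statement, not a replacement for the proof. Every product $g circ tau$ with $g$ fixing $x'$ and $tau in mathcal S$ is formed, and each permutation of $X'$ is seen to arise exactly once.
import itertools

n = 4
x_prime = n
X_prime = range(n + 1)

def compose(f, g):
    # (f o g)(i) = f(g(i)); a permutation p is stored as a tuple with p[i] the image of i
    return tuple(f[g[i]] for i in X_prime)

# Sym(X), extended to X' by fixing x'
sym_X = [p + (x_prime,) for p in itertools.permutations(range(n))]

# S: the identity together with the n transpositions swapping x' with an element of X
S = [tuple(X_prime)]
for k in range(n):
    t = list(X_prime)
    t[k], t[x_prime] = t[x_prime], t[k]
    S.append(tuple(t))

products = [compose(g, t) for g in sym_X for t in S]
assert len(set(products)) == len(sym_X) * len(S)                  # injective
assert set(products) == set(itertools.permutations(X_prime))      # surjective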
$endgroup$
edited Dec 31 '18 at 17:25
answered Dec 31 '18 at 12:40
CopyPasteIt
$begingroup$
Thanks for the answer. The level of rigour presented in your proof is way beyond what a textbook of this level expects. But I take it that you agree that my proof is valid/correct (albeit, less rigorous than the one you have presented)?
$endgroup$
– The Pointer
Dec 31 '18 at 15:35
$begingroup$
@ThePointer I have difficulty understanding what a task is. But can you follow the 'gist' of my argument? It may appear formal but the idea is straightforward. Does your textbook present transpositions? I thought it would be helpful to prove that every permutation is a product of transpositions using induction (usually one breaks down the cycles into transpositions).
$endgroup$
– CopyPasteIt
Dec 31 '18 at 15:42
$begingroup$
A "task" is as presented in the multiplication theorem: If there are $n_k$ ways to perform task $T_k$ for $k=1,dots,m$, then there are $n_1n_2dots n_m$ ways to perform all $m$ tasks.
$endgroup$
– The Pointer
Dec 31 '18 at 15:43
$begingroup$
And this is a discrete mathematics textbook (often used by computer scientists), so it doesn't introduce transpositions. It's a pretty basic-level textbook -- certainly not rigorous like textbooks for mathematicians are.
$endgroup$
– The Pointer
Dec 31 '18 at 15:44
$begingroup$
Yes, the textbook requests a proof by induction and says it will "leave the proof to the reader as an exercise", which is why I decided to do it. It actually presents the product rule in the same chapter (the chapter on combinatorics), immediately before this theorem, so I thought it would perhaps be natural to use it in the proof.
$endgroup$
– The Pointer
Dec 31 '18 at 15:52
$begingroup$
This answer might be more appropriate for students studying discrete mathematics and interested in combinatorics and/or computer science.
One can regard permutations as 'scrambled words';
see Permutation: One line notation.
So with the ordered alphabet $mathcal A = {a,b,c,dots,z}$ here are the lists of scrambled permutation words:
Scrambling the first letter: $P_1 = {a}$
Scrambling the first $2$ letters: $P_2 = {ab,ba}$
Scrambling the first $3$ letters: $P_3 = {abc,acb,bac,cab,cba,bca}$
etc.
These ideas can be implemented in Python; here we create $P_3$:
>>> import itertools
>>> list(itertools.permutations(['a','b','c'],3))
[('a', 'b', 'c'), ('a', 'c', 'b'), ('b', 'a', 'c'), ('b', 'c', 'a'), ('c', 'a', 'b'), ('c', 'b', 'a')]
How many scrambled permutation words can be formed with the first $4$ letters, ${a,b,c,d}$?
There is a mapping from $P_4$ to $P_3$ defined by removing the letter $d$ from a word $pmb omega$ in $P_4$.
Examples: $abdc mapsto abc$; $;cdba mapsto cba$.
It is easy to see that exactly $4$ words in $P_4$ get mapped to each word in $P_3$.
So the number of words in $P_4$ is equal to $4 times text{#}(P_3)$.
In general it can be shown that
$tag 1 text{#}(P_{n+1}) = (n+1) times text{#}(P_n)$
The above is all that is needed to get the $n!$ permutation count using induction.
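Continuing the Python illustration above, one can check both the '$4$ words per word' claim and the recurrence $(1)$ directly for small cases (a sketch only, with arbitrary small bounds):
import itertools
from collections import Counter

P3 = [''.join(w) for w in itertools.permutations('abc')]
P4 = [''.join(w) for w in itertools.permutations('abcd')]

# removing the letter d sends exactly 4 words of P_4 onto each word of P_3
fibres = Counter(w.replace('d', '') for w in P4)
assert set(fibres) == set(P3) and all(c == 4 for c in fibres.values())

# the recurrence #(P_{n+1}) = (n+1) * #(P_n), checked for n = 1, ..., 5
sizes = [len(list(itertools.permutations(range(n)))) for n in range(1, 7)]
assert all(sizes[n] == (n + 1) * sizes[n - 1] for n in range(1, 6))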
$endgroup$
edited Jan 1 at 20:29
answered Jan 1 at 15:09
CopyPasteIt
$begingroup$
Here's the basis of a proof I might do for this. I'm going to use a numerical argument to make my point clearer, but it will hold for a general $n$ too.
Suppose we assume that the statement holds true for $n=4$ and we use the letters $A,B,C,D$ as our four elements in the set.
One such listing might be $ABCD$.
Suppose then we wish to add a fifth element, $E$, to this set, whilst keeping the other four in order. We can do this in five ways, clearly:
$$EABCD, AEBCD, ABECD, ABCED, ABCDE$$
This will be the case for every permutation of the set of $4$, and it will cover every permutation of the set of $5$ without overlaps (keeping the original four in order rules out repeats).
Thus, the number of permutations of a set of $5$ is $4! cdot 5 = 5!$.
Generalise this and you have a proof.
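A quick Python sketch of exactly this construction (an illustration only): the first line reproduces the five listed insertions, and the second counts the permutations of the set of $5$ obtained from all $4!$ base orderings.
import itertools

base = 'ABCD'
print([base[:i] + 'E' + base[i:] for i in range(5)])
# ['EABCD', 'AEBCD', 'ABECD', 'ABCED', 'ABCDE']

perms5 = {w[:i] + 'E' + w[i:]
          for w in (''.join(p) for p in itertools.permutations(base))
          for i in range(5)}
print(len(perms5))   # 120, i.e. 4! * 5 = 5!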
$endgroup$
answered Jan 5 at 15:07
Rhys Hughes
$begingroup$
If $X$ is a set, then $S_X$ shall denote the set of all permutations of $X$ (that is, of all bijections from $X$ to $X$). You want to prove that every finite set $X$ satisfies $left|S_Xright| = left|Xright|!$. (Note that this is perfectly true when $X$ is empty.)
If you want a fully rigorous proof written up in all detail, then one place you can find it is Corollary 7.82 in my Notes on the combinatorial fundamentals of algebra, 10th of January 2019. This proof can be summarized as follows:
If $X$ is a set, and if $x$ and $y$ are two elements of $X$, then we let $t_{x, y}$ be the permutation of $X$ that swaps $x$ with $y$ while leaving all other elements of $X$ unchanged. (This is the identity map when $x = y$, and otherwise is a transposition.)
Now let $n$ be a nonnegative integer, and set $left[nright] := left{1,2,ldots,nright}$. Prove (Exercise 5.9 in op. cit.) that each permutation $sigma$ of $left[nright]$ can be written in the form $sigma = t_{1, i_1} circ t_{2, i_2} circ cdots circ t_{n, i_n}$ for a unique $left(i_1, i_2, ldots, i_nright) in left[1right] times left[2right] times cdots times left[nright]$. Roughly speaking, this simply means that $sigma$ can be turned into the identity permutation by first swapping $n$ with its image, then swapping $n-1$ with its image (under the resulting permutation), and so on, until all numbers $n, n-1, ldots, 1$ are in their proper places. I formalize this as an induction argument.
The above statement can be interpreted as a bijection between the set $left[1right] times left[2right] times cdots times left[nright]$ and the set $S_{left[nright]}$ of all permutations of $left[nright]$. The former set clearly has size $n!$; hence, so does the latter. Thus, the set $S_X$ of all permutations of any $n$-element set $X$ has size $n!$ as well (to prove this, fix some bijection between $X$ and $left[nright]$, and use it to construct a bijection from $S_X$ to $S_{left[nright]}$).
This is not the shortest proof. The post by @CopyPasteIt suggests a different argument, but muddles it up somewhat in my opinion, so here is a sketch of how I would do it:
Proceed by induction on $left|Xright|$. The induction base ($left|Xright| = 0$) is obvious. For the induction step, you need to show that if $X$ is a finite set and $x in X$ is arbitrary, then $left|S_Xright| = left|Xright| cdot left|S_{X setminus left{xright} }right|$.
So let us fix a finite set $X$ and some $x in X$. Let $W$ be the set of all permutations $tau in S_X$ satisfying $tauleft(xright) = x$. It is not hard to see that there is a bijection from $S_{X setminus left{xright} }$ to $W$. (Indeed, this follows from Proposition 5.58 (e) in op. cit., applied to $Y = X setminus left{xright}$; but you really should prove this yourself, as it is a straightforward formalization of a clear statement: A permutation of $X$ that fixes $x$ is "really" just a permutation of $X setminus left{xright}$. When you formalize the "really", you get a very natural bijection.) Hence, $left|Wright| = left|S_{X setminus left{xright} }right|$.
If $sigma in S_X$ is any permutation, then there is a unique $y in X$ satisfying $t_{x, y} circ sigma in W$ (namely, this $y$ is $sigmaleft(xright)$). Hence, each $sigma in S_X$ can be written as $t_{x, y} circ w$ for a unique pair $left(y, wright) in X times W$. In other words, the map $X times W to S_X, left(y, wright) mapsto t_{x, y} circ w$ is a bijection. Thus, $left|S_Xright| = left|X times Wright| = left|Xright| cdot left|Wright| = left|Xright| cdot left|S_{X setminus left{xright} }right|$ (since $left|Wright| = left|S_{X setminus left{xright} }right|$). This completes the induction step.
Things get somewhat easier once you get used to combinatorics and realize that the permutations of a finite $n$-element set $X$ are in bijection with the $n$-tuples $left(x_1, x_2, ldots, x_nright)$ of distinct elements of $X$. Indeed, confusingly, the latter $n$-tuples are often called permutations of $X$ as well, although there is in general no canonical bijection between these two kinds of permutations (see Remark 5.4 in op. cit.). This confusion set aside, it is not hard to see using some versions of the product law that the number of $n$-tuples $left(x_1, x_2, ldots, x_nright)$ of distinct elements of an $n$-element set $X$ is $n!$; in fact, more generally, for any nonnegative integer $m$, the number of $m$-tuples $left(x_1, x_2, ldots, x_mright)$ of distinct elements of an $n$-element set $X$ is $nleft(n-1right)left(n-2right)cdotsleft(n-m+1right)$. This gives the perhaps shortest proofs of your claim, having the advantage that you don't really have to deal with permutations (once you have shown that they are in bijection with these tuples).
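Here is a brute-force Python check (for the single case $n = 4$; a sketch, not the proof) of the bijection between $left[1right] times left[2right] times cdots times left[nright]$ and $S_{left[nright]}$ described above: every tuple $left(i_1, i_2, ldots, i_nright)$ is turned into the product $t_{1, i_1} circ t_{2, i_2} circ cdots circ t_{n, i_n}$, and each permutation of $left[nright]$ is seen to arise exactly once.
import itertools
import math

n = 4

def t(x, y):
    # the permutation of {1, ..., n} swapping x and y (the identity if x == y),
    # stored as a tuple p with p[i - 1] the image of i
    p = list(range(1, n + 1))
    p[x - 1], p[y - 1] = p[y - 1], p[x - 1]
    return tuple(p)

def compose(f, g):
    # (f o g)(i) = f(g(i))
    return tuple(f[g[i - 1] - 1] for i in range(1, n + 1))

products = set()
for tup in itertools.product(*(range(1, k + 1) for k in range(1, n + 1))):
    sigma = tuple(range(1, n + 1))                 # start from the identity
    for k, i_k in enumerate(tup, start=1):
        sigma = compose(sigma, t(k, i_k))          # builds t_{1,i_1} o ... o t_{n,i_n}
    products.add(sigma)

# the n! tuples give pairwise distinct permutations, and every permutation occurs
assert len(products) == math.factorial(n)
assert products == set(itertools.permutations(range(1, n + 1)))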
$endgroup$
edited Jan 13 at 14:54
answered Dec 31 '18 at 16:51
darij grinberg
$begingroup$
I think you should change the 4th line to "Now assume that $P(m)$ is true for some $mgeq 1$." (The word some is important because you don't assume it for all $m$; also, it should be greater than or equal to $1$, not $2$, since you still don't know about $m=2$).
$endgroup$
– Yanko
Dec 28 '18 at 16:21
$begingroup$
Careful with your notation: you use $P$ to denote the statement you want to prove, but then you write $P(1)=1!$
$endgroup$
– Sambo
Dec 28 '18 at 16:23
$begingroup$
@Yanko Thanks for the comment. But we already know that it’s true for $m =1$, don’t we?
$endgroup$
– The Pointer
Dec 28 '18 at 16:27
$begingroup$
@ThePointer yes that's why we can assume the claim holds for some $mgeq 1$. (in other words you also need to prove the implication $P(1)implies P(2)$.)
$endgroup$
– Yanko
Dec 28 '18 at 16:29
$begingroup$
I can also explain @Sambo's comment. I believe he is talking about the second line. Instead of writing $P(1)=1!$ you should say "$P(1)$ holds because the number of permutations of $1$ element is of size $1$ and $1=1!$". His point is that $P(1)$ is a statement, not a number.
$endgroup$
– Yanko
Dec 28 '18 at 16:31