What's the difference between finding the average Euclidean distance and using inertia_ in KMeans in sklearn?


























I've found two different approaches online when using the Elbow Method to determine the optimal number of clusters for K-Means.



One approach is to use the following code:



distortions_2.append(sum(np.min(cdist(data,
                                      kmeanModel.cluster_centers_,
                                      'euclidean'),
                                axis=1)) / data.shape[0])


[plot: elbow curve from the average-distance measure]



Another is to use inertia_ from sklearn.cluster.KMeans:



distortions_3.append(kmeanModel.inertia_)


[plot: elbow curve from inertia_]



When I plot the results (using the same random states), the two approaches give different curves, but I'm not sure why. Can anyone help?



Edit: If I replace the normalisation factor / data.shape[0] with a square **2 as suggested below, I still don't get the same curve as the inertia plot:



distortions_2.append(sum(np.min(cdist(data,
                                      kmeanModel.cluster_centers_,
                                      'euclidean'),
                                axis=1)) ** 2)


Using the square just makes the plot a little smoother, but it's definitely not the same as using inertia_. I'm not quite sure how inertia_ is calculated or what it means.
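For reference, here is a minimal check of the three quantities on toy data (the data below is made up purely for illustration; I'm taking inertia_ to be the within-cluster sum of squared distances, as the scikit-learn docs state):

```python
import numpy as np
from scipy.spatial.distance import cdist
from sklearn.cluster import KMeans

# Toy data, purely for illustration
rng = np.random.RandomState(0)
data = rng.randn(100, 2)

kmeanModel = KMeans(n_clusters=3, random_state=0, n_init=10).fit(data)

# Distance from each point to its nearest cluster centre
d = np.min(cdist(data, kmeanModel.cluster_centers_, 'euclidean'), axis=1)

avg_dist = d.sum() / data.shape[0]   # approach 1: mean distance to nearest centre
sum_sq = (d ** 2).sum()              # sum of squared distances
sq_of_sum = d.sum() ** 2             # squaring the summed distances instead

print(np.isclose(sum_sq, kmeanModel.inertia_))     # True
print(np.isclose(sq_of_sum, kmeanModel.inertia_))  # False
```

So only the sum of *individually squared* distances reproduces inertia_; neither the mean distance nor the squared total does.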





























  • Please do not post duplicates! stackoverflow.com/q/50467835/1060350 – Anony-Mousse, May 23 '18 at 5:07

  • Apologies! I've requested a merge. – lstodd, May 23 '18 at 9:48

  • Show ALL your code!!! Show the plot you got for the corrected equation. Come on man, you need to help yourself for us to help you. – JahKnows, May 25 '18 at 0:52





















python scikit-learn clustering k-means














asked May 22 '18 at 14:21 by lstodd


















1 Answer































The inertia is the sum of squared distances between each point and its nearest cluster centre. Thus, by removing the normalisation term / data.shape[0] and instead applying the square **2 to each distance inside the sum (note that squaring the summed total is not the same thing), these two expressions become equivalent:

distortions_2.append(sum(np.min(cdist(data,
                                      kmeanModel.cluster_centers_,
                                      'euclidean'),
                                axis=1) ** 2))


and



distortions_2.append(kmeanModel.inertia_)
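One way to verify the equivalence directly (a quick sketch on made-up data; the variable names just mirror the question's):

```python
import numpy as np
from scipy.spatial.distance import cdist
from sklearn.cluster import KMeans

# Synthetic data, only for the check
rng = np.random.RandomState(42)
data = rng.randn(200, 2)

kmeanModel = KMeans(n_clusters=4, random_state=42, n_init=10).fit(data)

# Sum of squared distances from each point to its nearest centre
manual = (np.min(cdist(data, kmeanModel.cluster_centers_, 'euclidean'),
                 axis=1) ** 2).sum()

print(np.isclose(manual, kmeanModel.inertia_))  # True
```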




For example, take this artificial data set, in which we generate 5 Gaussian clusters with 300 points each:



params = [[[ 0,1], [ 0,1]],
          [[ 5,1], [ 5,1]],
          [[-2,5], [ 2,5]],
          [[ 2,1], [ 2,1]],
          [[-5,1], [-5,1]]]

n = 300
dims = len(params[0])

data = []
y = []
for ix, i in enumerate(params):
    inst = np.random.randn(n, dims)
    for dim in range(dims):
        inst[:, dim] = params[ix][dim][0] + params[ix][dim][1] * inst[:, dim]
    label = ix + np.zeros(n)

    if len(data) == 0: data = inst
    else: data = np.append(data, inst, axis=0)
    if len(y) == 0: y = label
    else: y = np.append(y, label)

num_clusters = len(params)


[scatter plot of the five generated Gaussian clusters]



We then apply KMeans with different numbers of clusters and plot both measures:



k = [1,2,3,4,5,6,7,8,9,10]

inertias = []
dists = []

for i in k:
    kmeans = KMeans(i)
    kmeans.fit(data)
    inertias.append(kmeans.inertia_)
    dists.append(sum(np.min(spatial.distance.cdist(data, kmeans.cluster_centers_, 'euclidean'), axis=1) ** 2))

plt.plot(range(1, len(inertias) + 1), inertias, label='Inertia')
plt.plot(range(1, len(dists) + 1), dists, label='Distance')
plt.legend()
plt.show()


[plot: the Inertia and Distance curves coincide]






























  • Thanks for your response! I've tried plotting the same graph using the square instead of normalising as above, but I get the same-shaped graph, just a bit smoother. It's totally different from the plot I get using inertia_ from KMeans, so I still think I'm missing something. – lstodd, May 23 '18 at 10:48

  • Copy my code exactly as you see it. Or update your question with the entirety of your code and I'll correct it. – JahKnows, May 23 '18 at 10:49

  • I have edited the question above; the formula I've used and inertia_ are different, as the resultant plots are quite different. – lstodd, May 24 '18 at 9:35

  • @lstodd, can you show me your whole code please? What are distortions_2 and distortions_3? These must be the same. Just paste your code please. – JahKnows, May 24 '18 at 9:48











answered May 23 '18 at 2:51 by JahKnows, edited May 25 '18 at 0:50 by lstodd











