
#### Canleskis

##### Guest

*Peculiar performance variance with small arrays*

As part of a larger project (N-body simulations), I started working with arrays using const generics to represent algebraic vectors. These vectors can be of any length, but I'm especially interested in lengths 2 and 3.

I noticed that, in a certain scenario my project needs, performance with arrays of length 2 was worse (a lot worse!) than with similarly sized arrays (length 1 or 3, for example).

This is the scenario, or rather a shortened version of it:

Code:

```rust
fn fold<const N: usize>(v: Vec<Array<N>>) -> Vec<Array<N>> {
    let result = v.iter().map(|a1| {
        v.iter().fold(Array::default(), |acc, a2| {
            let d = *a2 - *a1;
            acc + d
        })
    });
    result.collect()
}
```
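For context, a minimal self-contained version of this setup might look like the following. The field layout and trait implementations of `Array` are my assumptions, since the original definition isn't shown:

```rust
use std::ops::{Add, Sub};

// Hypothetical reconstruction of the `Array` newtype described in the post;
// the inner `[f64; N]` field and the exact impls are assumptions.
#[derive(Clone, Copy, Debug, PartialEq)]
struct Array<const N: usize>([f64; N]);

impl<const N: usize> Default for Array<N> {
    fn default() -> Self {
        Array([0.0; N])
    }
}

impl<const N: usize> Add for Array<N> {
    type Output = Self;
    fn add(self, rhs: Self) -> Self {
        let mut out = self.0;
        for i in 0..N {
            out[i] += rhs.0[i];
        }
        Array(out)
    }
}

impl<const N: usize> Sub for Array<N> {
    type Output = Self;
    fn sub(self, rhs: Self) -> Self {
        let mut out = self.0;
        for i in 0..N {
            out[i] -= rhs.0[i];
        }
        Array(out)
    }
}

// The function from the post: for each element a1, sum (a2 - a1) over all a2.
fn fold<const N: usize>(v: Vec<Array<N>>) -> Vec<Array<N>> {
    let result = v.iter().map(|a1| {
        v.iter().fold(Array::default(), |acc, a2| {
            let d = *a2 - *a1;
            acc + d
        })
    });
    result.collect()
}

fn main() {
    let v = vec![Array([1.0, 2.0]), Array([3.0, 6.0])];
    let sums = fold(v);
    // element 0: ([1,2]-[1,2]) + ([3,6]-[1,2]) = [2, 4]
    // element 1: ([1,2]-[3,6]) + ([3,6]-[3,6]) = [-2, -4]
    println!("{:?}", sums);
}
```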

The arrays are wrapped in a newtype called `Array` that implements the arithmetic operators for convenience, but the results are the same without the wrapper. Using criterion with `Vec`s of lengths 10 to 500 filled with random arrays, I benchmarked this function and generated the following graph for differently sized arrays:

As you can see, performance is a lot worse for arrays of length 2 than for the rest. I can't understand why this specific scenario causes such a performance gap, so I'm hoping someone can help me figure it out, because it has slowed my progress significantly.
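As a rough, dependency-free way to reproduce the comparison (a stand-in for the criterion benchmark, using `std::time::Instant` and deterministic inputs instead of random ones), something like the following sketch could be used; `fold_plain` is my wrapper-free variant on bare `[f64; N]` arrays, which the post says behaves the same:

```rust
use std::time::Instant;

// Wrapper-free variant of the fold from the post, on bare `[f64; N]` arrays.
fn fold_plain<const N: usize>(v: &[[f64; N]]) -> Vec<[f64; N]> {
    v.iter()
        .map(|a1| {
            v.iter().fold([0.0; N], |mut acc, a2| {
                for i in 0..N {
                    acc[i] += a2[i] - a1[i];
                }
                acc
            })
        })
        .collect()
}

// Very rough stand-in for the criterion benchmark: deterministic inputs and
// a single wall-clock measurement. 500 matches the upper end of the sweep.
fn time_fold<const N: usize>(len: usize) -> std::time::Duration {
    let v: Vec<[f64; N]> = (0..len)
        .map(|i| std::array::from_fn(|j| ((i * N + j) as f64).sin()))
        .collect();
    let start = Instant::now();
    let out = fold_plain(&v);
    let elapsed = start.elapsed();
    assert_eq!(out.len(), len); // use the result so it isn't optimized away
    elapsed
}

fn main() {
    for (n, t) in [
        (1, time_fold::<1>(500)),
        (2, time_fold::<2>(500)),
        (3, time_fold::<3>(500)),
    ] {
        println!("N = {n}: {t:?}");
    }
}
```

A single `Instant` measurement is far noisier than criterion's statistics, so this only gives a coarse signal; it should nevertheless show whether N = 2 stands out.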
