Does anyone know which methods are most efficient for very intensive searches over arrays (I mean arrays of millions of elements) in C#? For example, is it better to use IndexOf or BinarySearch to obtain the index of an item? Should I use arrays or a HashSet? And how could I use a HashSet to find the matches in the array?
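For context on the IndexOf-vs-BinarySearch question: Array.BinarySearch only works on a sorted array, and sorting discards the original positions, so it cannot answer "where was this value in the original block". A small sketch of the trade-off (the class and method names here are mine, purely for illustration):

```csharp
using System;

static class SearchComparison
{
    // Linear scan over unsorted data: the only option when the original
    // order (and therefore the original index) must be preserved.
    public static int LinearIndex(int[] data, int value) =>
        Array.IndexOf(data, value);

    // Binary search is O(log n) but requires sorted input; the returned
    // index refers to the sorted copy, not to the original array.
    public static int SortedIndex(int[] data, int value)
    {
        int[] copy = (int[])data.Clone();
        Array.Sort(copy);
        return Array.BinarySearch(copy, value);
    }
}
```

So for scanning raw, unsorted memory regions, a linear IndexOf-style scan is the relevant baseline.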
A logical example of the algorithm I need:
Block: 071827490593720123213023498230402000813
Value to find: 40200
Objective: return the index where that group of digits begins
Expected result: 30
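The example above can be sketched as a straightforward sub-sequence search. This is a minimal naive version, assuming the block and the pattern are byte arrays (the names FindAll, block, and pattern are mine, not from the original code):

```csharp
using System;
using System.Collections.Generic;

static class NaiveSearch
{
    // Returns every index at which `pattern` occurs inside `block`.
    public static List<int> FindAll(byte[] block, byte[] pattern)
    {
        var matches = new List<int>();
        if (pattern.Length == 0) return matches;

        for (int i = 0; i <= block.Length - pattern.Length; i++)
        {
            // Compare the pattern against the block at offset i.
            int j = 0;
            while (j < pattern.Length && block[i + j] == pattern[j]) j++;
            if (j == pattern.Length) matches.Add(i);
        }
        return matches;
    }
}
```

On the example data, searching the digit string "071827490593720123213023498230402000813" for "40200" yields the single index 30.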
The code I currently have:
int offSet = 0;
// Scan for the first byte of the pattern, then verify the rest in place.
while ((offSet = Array.IndexOf(bloque, encontrar[0], offSet)) != -1)
{
    if (encontrar.Length > 1)
    {
        for (int i = 1; i < encontrar.Length; i++)
        {
            // Stop if the remaining block is shorter than the pattern.
            if (bloque.Length <= offSet + encontrar.Length) break;
            else if (encontrar[i] != bloque[offSet + i])
            {
                // On a mismatch, jump ahead if a new candidate start byte
                // was seen inside the window.
                if (bloque[offSet + i] == encontrar[0])
                { offSet += (i - 1); break; }
                else if (i == encontrar.Length - 1)
                { offSet += i; break; }
                break;
            }
            else if (i == encontrar.Length - 1)
            {
                // Full match: record the absolute address.
                addresses.Add(new IntPtr((int)baseAddress + offSet));
            }
        }
    }
    else
    {
        // Single-byte pattern: every IndexOf hit is a match.
        addresses.Add(new IntPtr((int)baseAddress + offSet));
    }
    offSet++;
}
This algorithm is not slow, but it is not as fast as the program I am comparing it against. The program I am developing opens processes and searches for values in their memory regions (yes, I am comparing it against Cheat Engine). As you can see, it is more or less similar to the Boyer-Moore algorithm, but I would like to know whether I can replace functions to increase performance, or whether I should remove or change something in the logic of the algorithm to increase performance.
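Since Boyer-Moore comes up: a minimal sketch of the Boyer-Moore-Horspool variant over byte arrays, which is a common way to speed up this kind of memory scan. It precomputes a per-byte shift table so that mismatches can skip ahead by up to the full pattern length instead of one byte at a time. The names here (HorspoolSearch, FindAll, block, pattern) are mine, not from the original code:

```csharp
using System;
using System.Collections.Generic;

static class HorspoolSearch
{
    // Boyer-Moore-Horspool: average-case sublinear search for `pattern`
    // inside `block`; returns every match index (including overlaps).
    public static List<int> FindAll(byte[] block, byte[] pattern)
    {
        var matches = new List<int>();
        int m = pattern.Length;
        if (m == 0 || block.Length < m) return matches;

        // Shift table: for each byte value, how far the window may slide
        // after looking at the last byte of the current window.
        var shift = new int[256];
        for (int b = 0; b < 256; b++) shift[b] = m;
        for (int i = 0; i < m - 1; i++) shift[pattern[i]] = m - 1 - i;

        int pos = 0;
        while (pos <= block.Length - m)
        {
            // Compare right to left.
            int j = m - 1;
            while (j >= 0 && block[pos + j] == pattern[j]) j--;
            if (j < 0) matches.Add(pos);

            // Whether or not it matched, slide by the shift of the byte
            // under the window's last position.
            pos += shift[block[pos + m - 1]];
        }
        return matches;
    }
}
```

For large patterns the skip distances grow with the pattern length, which is where this tends to beat a byte-at-a-time IndexOf loop; for very short patterns the difference is small.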