It depends. Does clicking each link open the same page populated with different data, or a different page each time?
If the links open the same page with different data, I would treat all the links as a collection of clones. You can achieve this by generalizing your match rules so that all of the links are matched at the same time. You should then be able to get a collection of clones, extract a proxy, and pass it into a loop.
Inside the for loop you can carry out whatever tasks you need for each link.
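The clone-loop pattern above can be sketched in plain Python. Since your automation platform's actual API isn't shown here, `find_all_links`, `open_link`, and `process_page` are hypothetical stand-ins for the matched-controls collection and your per-page actions:

```python
def process_all_links(find_all_links, open_link, process_page):
    """Treat every matched link as one collection and handle each in turn.

    find_all_links: hypothetical callable returning the collection of
        matched link proxies (all links matched by the generalized rule)
    open_link, process_page: hypothetical per-link actions supplied by you
    """
    results = []
    # Matching is generalized, so all links come back in one collection.
    for link in find_all_links():
        open_link(link)                      # navigate to the page behind this link
        results.append(process_page(link))   # carry out the per-page tasks
    return results

# Illustration only, with stub callables standing in for real proxies:
links = ["link-a", "link-b", "link-c"]
out = process_all_links(lambda: links,
                        lambda l: None,
                        lambda l: l.upper())
print(out)  # → ['LINK-A', 'LINK-B', 'LINK-C']
```

The key design point is the first paragraph of the answer: one generalized match rule yields one collection, so the per-link work lives in a single loop body instead of being duplicated per link.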
If the number of pages you get from clicking the links is too great to manage efficiently with the method above, you may have to interrogate each of your links and pages individually and build your automation to execute linearly.