Retryable pattern for file processing in Java


I need to process a large file (every line has the same format, with multiple columns). Since the program may crash during processing, I need the processing to be retryable: after a crash, I should be able to start the program again and have it resume the work.

Is there a pattern I can follow, or a library I can use? Thanks!


Update:

Regarding the crashes: they are not just OOM or other internal issues. Timeouts with other components or machine failures can also happen, so a try/catch alone cannot handle them.


Another update:

About splitting the file: it is possible in my case, but not as simple as it sounds. As I said, the file is formatted with multiple columns, and I could divide it into hundreds of smaller files based on one column and then process them one by one. But instead of doing that, I would like to learn a more general solution for making large file/data processing retryable.

You can maintain checkpoint/commit style logic in your code, so that when the program runs again it starts from the last checkpoint.

You can use RandomAccessFile to read the file and use getFilePointer() as your checkpoint, which you persist somewhere. When you run the program again, you resume from that checkpoint by calling seek(offset).
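A minimal sketch of that idea, assuming a per-line format; the file names and the processLine hook are made up for illustration. The offset returned by getFilePointer() is written to a small checkpoint file after each line is processed, and on restart the reader seeks back to it:

    import java.io.IOException;
    import java.io.RandomAccessFile;
    import java.nio.charset.StandardCharsets;
    import java.nio.file.Files;
    import java.nio.file.Path;

    public class CheckpointedFileProcessor {

        // Hypothetical location for the persisted checkpoint.
        private static final Path CHECKPOINT = Path.of("processor.checkpoint");

        public static void main(String[] args) throws IOException {
            try (RandomAccessFile file = new RandomAccessFile("big-input.txt", "r")) {
                // Resume from the last committed offset, or 0 on a fresh run.
                file.seek(readCheckpoint());

                String line;
                while ((line = file.readLine()) != null) {
                    processLine(line);
                    // Commit only after the line was fully processed, so a crash
                    // inside processLine reprocesses that line rather than skipping it.
                    writeCheckpoint(file.getFilePointer());
                }
            }
        }

        private static void processLine(String line) {
            // Placeholder for the real per-line work (parsing columns, etc.).
            System.out.println("processed: " + line);
        }

        private static long readCheckpoint() throws IOException {
            if (Files.exists(CHECKPOINT)) {
                return Long.parseLong(Files.readString(CHECKPOINT, StandardCharsets.UTF_8).trim());
            }
            return 0L;
        }

        private static void writeCheckpoint(long offset) throws IOException {
            Files.writeString(CHECKPOINT, Long.toString(offset), StandardCharsets.UTF_8);
        }
    }

In practice you would probably commit the checkpoint every N lines or per batch rather than per line (to reduce I/O), and write it atomically (write to a temp file and rename), but the resume logic stays the same.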
