Question | 2 Replies | 1874 Views | Closed

File listener, processing file with 1 million+ records within 30 minutes

Andrew N (AndrewN0) — Member since 2012, 4 posts
Posted: 15 Dec 2016 16:18 EST | Last activity: 17 May 2017 10:38 EDT

Hello, we currently have a requirement to read and process a million records within a relatively short time frame (roughly 30 minutes). The current implementation leverages a file listener and a parse rule, but we were wondering whether there is a better approach to handle this volume.

***Moderator Edit: Vidyaranjan | Updated Categories: Data Integration***

[Moderation Team has archived this post. The thread is closed to future replies; content and links will no longer be updated. If you have the same or a similar question, please write a new question.]

---

Veera Gangababu Gollapalli (Gangababu), Principal Technical Solutions Engineer, Pega — replied to AndrewN0
Posted: 16 Dec 2016 1:29 EST

Would this help? https://pdn.pega.com/community/pega-product-support/question/file-listener-handling-large-number-records

---

Arjun Lath (ARJUNLATH), Senior System Architect, HCL — replied to AndrewN0
Posted: 17 May 2017 10:38 EDT

Please use a data flow. Handling a million records in 30 minutes is quite feasible; we have already done it on one of our projects.
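For context on the numbers: a million records in 30 minutes is about 1,000,000 / 1,800 s ≈ 556 records per second, which a single-threaded parse loop may struggle to sustain once per-record work is non-trivial. The data flow suggestion above works by partitioning the input and processing partitions in parallel. Pega data flows are configured in the platform rather than coded, so the sketch below only illustrates the same partition-and-parallelize idea in plain Python; all function names here are hypothetical and not part of any Pega API.

```python
# Illustrative sketch of partitioned parallel record processing,
# the general technique behind batch data flow runs. Hypothetical
# names; not Pega code.
import csv
import io
from concurrent.futures import ThreadPoolExecutor


def process_record(row):
    # Placeholder per-record work, e.g. validate and map fields.
    return {"id": row[0], "amount": float(row[1])}


def process_chunk(rows):
    # Worker: handle one partition of the file.
    return [process_record(r) for r in rows]


def process_file(text, workers=4, chunk_size=250):
    # Split the parsed rows into fixed-size partitions, then fan the
    # partitions out to a worker pool. map() preserves partition order.
    rows = list(csv.reader(io.StringIO(text)))
    chunks = [rows[i:i + chunk_size] for i in range(0, len(rows), chunk_size)]
    results = []
    with ThreadPoolExecutor(max_workers=workers) as pool:
        for part in pool.map(process_chunk, chunks):
            results.extend(part)
    return results
```

A thread pool is used here only to keep the sketch self-contained; for CPU-bound per-record work you would use processes (or, in Pega, the data flow's own node-level partitioning) to get real parallelism.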