Worst case scenario? Let's see: suppose one computer could check 1,000,000,000 numbers per second (far more than a real machine could manage against a number this size). Then that means:
1 sec = 1,000,000,000
1 min = 60,000,000,000
1 hour = 3,600,000,000,000
1 day = 86,400,000,000,000
1 year(365.25 days) = 31,557,600,000,000,000
1 decade (10 years) = 315,576,000,000,000,000
1 century (100 years) = 3,155,760,000,000,000,000
1 millennium (1000 years) = 31,557,600,000,000,000,000
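If you want to double-check that table, here is a quick Python sketch. The 1,000,000,000-per-second rate is just the assumption from above.

rate = 1_000_000_000                # assumed: numbers checked per second
year = 86_400 * 365.25              # seconds in a 365.25-day year

periods = {
    "second": 1,
    "minute": 60,
    "hour": 3_600,
    "day": 86_400,
    "year (365.25 days)": year,
    "decade (10 years)": 10 * year,
    "century (100 years)": 100 * year,
    "millennium (1000 years)": 1_000 * year,
}

for label, seconds in periods.items():
    print(f"1 {label} = {rate * seconds:,.0f}")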
That is one computer over the course of a millennium. 1,000 years only gets us about 20 digits, and we need 309 digits.
So, rounding that millennium figure up to 10^20: 10^309 divided by 10^20 is 10^289, if my math is correct. That means we would need about 10^289 computers to factor it in 1,000 years. Remember, we were generous at the start of this and gave each computer a billion numbers per second.
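Here is that last division done in Python with exact integers, so the 309-digit figure doesn't overflow anything. The 10^309 search space, the 1,000-year budget, and the billion-per-second rate are the same assumptions as above.

per_machine = 1_000_000_000 * 31_557_600 * 1_000      # one machine, 1,000 years, a billion checks per second
search_space = 10 ** 309                              # worst case: every candidate for a 309-digit number
machines = search_space // per_machine                # machines needed to cover the space in 1,000 years
print(f"about 10^{len(str(machines)) - 1} machines")  # -> about 10^289 machines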
This is the worst case scenario.