When watching an active job, the job rate usually starts at a high number, then tapers down and stabilizes over time. Once the job is done, the reported throughput looks like an average based on the total data backed up divided by the total job time.
How does Backup Exec calculate this internally? This would be handy to understand when troubleshooting performance issues, where you want to know the real throughput and not the calculated throughput.
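
To illustrate what I mean, here's a minimal sketch (Python, with made-up sample values) of how a cumulative-average rate would start high and taper toward the overall average, compared with the instantaneous rate per interval. This is only my guess at what the job monitor might be doing, not the actual Backup Exec formula:

```
# Assumption (not a confirmed Backup Exec formula): the displayed "job rate"
# is a cumulative average, i.e. total bytes transferred / elapsed seconds.
# With an initial burst, that average starts high and tapers toward the
# overall average, which is the single figure reported when the job ends.

samples_mb_per_interval = [900, 850, 400, 380, 370, 360, 350, 355, 345, 350]
interval_seconds = 60  # assume one sample per minute

total_mb = 0
for minute, mb in enumerate(samples_mb_per_interval, start=1):
    total_mb += mb
    elapsed = minute * interval_seconds
    cumulative_rate = total_mb / elapsed        # what the monitor may display
    instantaneous_rate = mb / interval_seconds  # the "real" current throughput
    print(f"minute {minute:2d}: cumulative {cumulative_rate:5.1f} MB/s, "
          f"instantaneous {instantaneous_rate:5.1f} MB/s")
```

If that assumption is right, the final cumulative value would simply be total data backed up divided by total job time, which matches the behavior I'm seeing.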