Training tends to be more compute-intensive, while inference can typically run on a much smaller hardware footprint.
The neater idea would be a standard model, or set of models, so that a single ~30 GB package could cover roughly 80% of target cases; games and video seem like good candidates for this.