The fracture response of materials is highly sensitive to microstructural details and defect distributions. Accordingly, in fragmentation analysis and other fracture problems, the use of deterministic, homogeneous material properties can result in nonphysical responses. Using random fields as the underlying material properties has been shown to mitigate this problem. We use random fields with varied one- and two-point statistics as input elastic or fracture properties. Cohesive and phase field models are used to represent fracture at material interfaces and in the bulk, respectively, for fragmentation analysis. The raw input to a fracture simulation consists of the values of the elastic and fracture fields at specified positions on a uniform spatial grid. The outputs include the mechanical solution, e.g., displacement, stress, and damage, at the same locations at later times, as well as the time histories of various energies and average forces. Learning the relation between the input parameters and the output quantities of interest (QoIs) is challenging for several reasons: the high dimension of the input space, the cost of each fracture simulation, and the need to run many such forward simulations to accurately propagate statistics from the input fields to the output QoIs. We will design a machine learning workflow to identify the required data size, appropriate preprocessing methods, and learning algorithms for extracting input-output relationships. Notably, we will use dimensionality reduction techniques, such as Principal Component Analysis and its variants, to reduce the size of the input space while controlling the tradeoff between computational and statistical efficiency; this not only accelerates training but also better preserves the input characteristics that most affect the output QoIs. Moreover, we address model selection by searching over a wide array of learning algorithms along with various evaluation metrics to measure the generalization error of the designed system.
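The workflow described above can be sketched as follows. This is a minimal illustration, not the authors' actual pipeline: the grid size, sample count, correlation length, candidate models, and the synthetic scalar QoI are all placeholder assumptions. Correlated random-field inputs are mimicked by spectrally filtered white noise, the flattened fields are compressed with PCA, and candidate regressors are compared by cross-validated generalization error.

```python
# Sketch of the PCA-plus-model-selection workflow (assumed quantities:
# grid size, sample count, correlation length, surrogate QoI).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_samples, grid = 200, 32  # 200 field realizations on a 32x32 grid


def random_field(rng, n, corr=4.0):
    """Surrogate correlated Gaussian field: filter white noise in Fourier space."""
    k2 = np.fft.fftfreq(n)[:, None] ** 2 + np.fft.fftfreq(n)[None, :] ** 2
    spectral_filter = np.exp(-corr**2 * k2)  # damps high wavenumbers
    noise = rng.standard_normal((n, n))
    return np.real(np.fft.ifft2(np.fft.fft2(noise) * spectral_filter))


# Each row is one flattened input field (dimension grid*grid = 1024).
X = np.stack([random_field(rng, grid).ravel() for _ in range(n_samples)])

# Placeholder scalar QoI standing in for, e.g., total dissipated energy.
y = X.mean(axis=1) + 0.1 * rng.standard_normal(n_samples)

# Model selection: compare pipelines by cross-validated R^2.
models = {
    "pca+ridge": make_pipeline(StandardScaler(), PCA(n_components=20), Ridge()),
    "pca+forest": make_pipeline(
        PCA(n_components=20),
        RandomForestRegressor(n_estimators=100, random_state=0),
    ),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5, scoring="r2")
    print(f"{name}: mean CV R^2 = {scores.mean():.3f}")
```

The number of retained principal components (20 here) is the knob that trades statistical efficiency (fewer components, lower variance) against the risk of discarding input features that drive the output QoIs; in practice it would itself be chosen by cross-validation.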