DeepSDFStruct.deep_sdf.training#
Classes
- class DeepSDFStruct.deep_sdf.training.ClampedL1Loss(clamp_val=0.1)#
Bases: torch.nn.modules.module.Module
- forward(input, target)#
Define the computation performed at every call.
Should be overridden by all subclasses.
Note
Although the recipe for the forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this, since the former takes care of running the registered hooks while the latter silently ignores them.
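Example
A minimal usage sketch: the tensors below are placeholders, and the exact behaviour inside forward (clamping SDF values to ±clamp_val before the L1 difference) is an assumption based on the DeepSDF training recipe, not stated by this documentation.
>>> import torch
>>> loss_fn = ClampedL1Loss(clamp_val=0.1)
>>> pred_sdf = torch.randn(128, 1, requires_grad=True)  # placeholder predicted SDF values
>>> gt_sdf = torch.randn(128, 1)                         # placeholder ground-truth SDF values
>>> loss = loss_fn(pred_sdf, gt_sdf)  # call the instance, not .forward(), so hooks run
>>> loss.backward()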
- class DeepSDFStruct.deep_sdf.training.ConstantLearningRateSchedule(value)#
Bases: DeepSDFStruct.deep_sdf.training.LearningRateSchedule
- get_learning_rate(epoch)#
- class DeepSDFStruct.deep_sdf.training.StepLearningRateSchedule(initial, interval, factor)#
Bases: DeepSDFStruct.deep_sdf.training.LearningRateSchedule
- get_learning_rate(epoch)#
- class DeepSDFStruct.deep_sdf.training.WarmupLearningRateSchedule(initial, warmed_up, length)#
Bases: DeepSDFStruct.deep_sdf.training.LearningRateSchedule
- get_learning_rate(epoch)#
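Example
An illustrative sketch of driving an optimizer with these schedules: the decoder, the latent embedding, and the pairing of schedules with parameter groups are placeholder assumptions for the example, not behaviour fixed by this module.
>>> import torch
>>> schedules = [
...     StepLearningRateSchedule(initial=5e-4, interval=500, factor=0.5),
...     WarmupLearningRateSchedule(initial=1e-5, warmed_up=1e-3, length=100),
... ]
>>> decoder = torch.nn.Linear(3, 1)        # stand-in for the SDF decoder
>>> latents = torch.nn.Embedding(8, 256)   # stand-in for the latent codes
>>> optimizer = torch.optim.Adam([
...     {"params": decoder.parameters(), "lr": schedules[0].get_learning_rate(0)},
...     {"params": latents.parameters(), "lr": schedules[1].get_learning_rate(0)},
... ])
>>> for epoch in range(3):
...     for group, schedule in zip(optimizer.param_groups, schedules):
...         group["lr"] = schedule.get_learning_rate(epoch)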
Functions
- DeepSDFStruct.deep_sdf.training.append_parameter_magnitudes(param_mag_log, model)#
- DeepSDFStruct.deep_sdf.training.clip_logs(loss_log, lr_log, timing_log, lat_mag_log, param_mag_log, epoch)#
- DeepSDFStruct.deep_sdf.training.create_interpolated_meshes_from_latent(experiment_directory, indices, steps, checkpoint='latest', max_batch=32, filetype='ply', device='cpu')#
Interpolate between latent vectors and export reconstructed meshes.
This function loads a trained DeepSDF model and its latent vectors, then interpolates between consecutive latent codes specified in indices. At each interpolation step, a 3D surface mesh is reconstructed and exported to disk in the requested format.
- Parameters:
experiment_directory (str | PathLike) – Path to the experiment directory containing checkpoints and latent vectors.
indices (list[int], optional) – Sequence of latent vector indices between which interpolation is performed. Defaults to [1, 2, 3, 4, 5, 6, 7, 8].
steps (int, optional) – Number of interpolation steps (including endpoints). Defaults to 11.
checkpoint (str, optional) – Which checkpoint to load. Defaults to “latest”.
max_batch (int, optional) – Maximum batch size for inference. Defaults to 32.
filetype (str, optional) – File extension for exported meshes (e.g., “ply”, “obj”). Defaults to “ply”.
device (str, optional) – Device on which inference is run. Defaults to “cpu”.
- Return type:
None
Example
>>> create_interpolated_meshes_from_latent(
...     experiment_directory="experiments/run1",
...     indices=[1, 2, 3, 4, 5, 6, 7, 8],
...     steps=11,
...     checkpoint="latest",
...     max_batch=32,
...     filetype="ply",
... )
- DeepSDFStruct.deep_sdf.training.get_learning_rate_schedules(specs)#
- DeepSDFStruct.deep_sdf.training.get_mean_latent_vector_magnitude(latent_vectors)#
- DeepSDFStruct.deep_sdf.training.get_spec_with_default(specs, key, default)#
- DeepSDFStruct.deep_sdf.training.load_logs(experiment_directory)#
- DeepSDFStruct.deep_sdf.training.load_optimizer(experiment_directory, filename, optimizer)#
- DeepSDFStruct.deep_sdf.training.reconstruct_meshs_from_latent(experiment_directory, checkpoint='latest', max_batch=32, filetype='ply', device='cpu')#
- DeepSDFStruct.deep_sdf.training.save_latent_vectors(experiment_directory, filename, latent_vec, epoch)#
- DeepSDFStruct.deep_sdf.training.save_logs(experiment_directory, loss_log, lr_log, timing_log, lat_mag_log, param_mag_log, epoch)#
- DeepSDFStruct.deep_sdf.training.save_model(experiment_directory, filename, decoder, epoch)#
- DeepSDFStruct.deep_sdf.training.save_optimizer(experiment_directory, filename, optimizer, epoch)#
- DeepSDFStruct.deep_sdf.training.train_deep_sdf(experiment_directory, data_source, continue_from=None, batch_split=1, device=None)#
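Example
A minimal call sketch: the paths are placeholders, and the assumed directory layout (an experiment directory holding a specs.json plus a data source of preprocessed SDF samples) follows the usual DeepSDF convention rather than anything stated here.
>>> train_deep_sdf(
...     experiment_directory="experiments/run1",
...     data_source="data/SdfSamples",
...     continue_from=None,   # or a checkpoint label to resume training (assumption)
...     batch_split=1,
...     device="cuda",        # assumption: a torch device string is accepted here
... )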