TreeSketchNet: From Sketch to 3D Tree Parameters Generation

Gilda Manfredi; Nicola Capece; Ugo Erra; Monica Gruosso
2023-01-01

Abstract

Three-dimensional (3D) modeling of non-linear objects from stylized sketches is a challenge even for computer graphics experts, and extrapolating object parameters from a stylized sketch is a complex and cumbersome task. In the present study, we propose a broker system that transforms a stylized sketch of a tree into a complete 3D model by mediating between a modeler and 3D modeling software. The input sketches do not need to be accurate or detailed: they need only contain a rudimentary outline of the tree that the modeler wishes to model in 3D. Our approach relies on a well-defined convolutional Deep Neural Network architecture, called TreeSketchNet (TSN), capable of generating Weber and Penn [1995] parameters from a simple sketch of a tree. These parameters are then interpreted by the modeling software, which generates the 3D model of the tree depicted in the sketch. The training dataset consists of synthetically generated sketches associated with Weber–Penn parameters, produced by a dedicated add-on for the Blender modeling software. The accuracy of the proposed method is demonstrated by testing the TSN with both synthetic and hand-made sketches. Finally, we provide a qualitative analysis of our results by evaluating the coherence of the predicted parameters with several distinguishing features.
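As an illustration of the pipeline summarized above, the following is a minimal, hypothetical Python/TensorFlow sketch of a convolutional regressor that maps a rasterized tree sketch to a flat vector of Weber–Penn-style parameters. The input resolution (IMG_SIZE), the layer sizes, and the number of regressed parameters (NUM_PARAMS) are assumptions chosen for demonstration only and do not reproduce the actual TreeSketchNet architecture.

    # Minimal illustrative sketch, NOT the authors' TreeSketchNet:
    # a small CNN that regresses a vector of Weber-Penn-style tree
    # parameters from a grayscale sketch image.
    import tensorflow as tf
    from tensorflow.keras import layers, models

    IMG_SIZE = 224      # assumed resolution of the rasterized sketch
    NUM_PARAMS = 40     # assumed number of regressed parameters

    def build_sketch_to_params_model() -> tf.keras.Model:
        """Build a small CNN that maps a sketch image to a parameter vector."""
        inputs = layers.Input(shape=(IMG_SIZE, IMG_SIZE, 1))   # grayscale sketch
        x = layers.Conv2D(32, 3, activation="relu", padding="same")(inputs)
        x = layers.MaxPooling2D()(x)
        x = layers.Conv2D(64, 3, activation="relu", padding="same")(x)
        x = layers.MaxPooling2D()(x)
        x = layers.Conv2D(128, 3, activation="relu", padding="same")(x)
        x = layers.GlobalAveragePooling2D()(x)
        x = layers.Dense(256, activation="relu")(x)
        outputs = layers.Dense(NUM_PARAMS)(x)                  # continuous parameter values
        return models.Model(inputs, outputs, name="sketch_to_params")

    model = build_sketch_to_params_model()
    model.compile(optimizer="adam", loss="mse")  # plain regression on parameter values
    model.summary()

In the full system described in the abstract, the predicted values would be mapped back to named Weber–Penn parameters and handed to the Blender add-on, which grows the corresponding 3D tree; that mapping is specific to the authors' add-on and is not reproduced here.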
Files in this record:
3579831.pdf (Type: Editorial PDF; License: Editorial version; Size: 2.85 MB; Format: Adobe PDF; access restricted to authorized users)

Use this identifier to cite or link to this document: https://hdl.handle.net/11563/166296