Sketch2Code : From Sketch Design On Paper To Website Interface
Akash Bharat Wadje, Prof. Rohit Bagh
Keywords: HTML, Artificial Intelligence, Sketch2Code, Graphical User Interface
Generally, anyone who wants to build a website must hire a developer, which costs considerable money and time; even after this investment, the resulting interface sometimes does not match what the user envisioned. Currently, the developer is the bridge between the user's sketches and the program code. To overcome these problems and restrictions, Microsoft's Artificial Intelligence team introduced a platform named "Sketch2Code". Transforming a handwritten sketch of a graphical user interface created by a designer into computer code is a typical task a developer performs when building customized websites. In this paper, we show that machine learning methods can play a major role in training a model end-to-end to generate code from a user-supplied image, with support across platforms (i.e., iOS, Android, and Windows) through the browser. We present two approaches that automate this process: one using classical computer vision techniques, and another using a novel application of deep semantic segmentation networks. We use a dataset of websites to train and evaluate both approaches.
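The classical computer-vision route mentioned in the abstract can be illustrated with a deliberately simplified sketch. Assuming the hand-drawn page has already been binarized into a small grid of 0/1 cells, connected regions can be flood-filled into bounding boxes, and each box emitted as an absolutely positioned `<div>`. The function names (`find_boxes`, `boxes_to_html`) and the fixed `cell` scale are hypothetical illustrations, not part of the actual Sketch2Code implementation, which operates on real images.

```python
from collections import deque

def find_boxes(bitmap):
    """Return bounding boxes (top, left, bottom, right) of connected
    regions of 1s in a binary grid, found via BFS flood fill."""
    rows, cols = len(bitmap), len(bitmap[0])
    seen = [[False] * cols for _ in range(rows)]
    boxes = []
    for r in range(rows):
        for c in range(cols):
            if bitmap[r][c] and not seen[r][c]:
                # Flood-fill one connected component, tracking its extent.
                top, left, bottom, right = r, c, r, c
                queue = deque([(r, c)])
                seen[r][c] = True
                while queue:
                    y, x = queue.popleft()
                    top, bottom = min(top, y), max(bottom, y)
                    left, right = min(left, x), max(right, x)
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and bitmap[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                boxes.append((top, left, bottom, right))
    return boxes

def boxes_to_html(boxes, cell=20):
    """Emit one absolutely positioned <div> per detected box,
    scaling grid cells to pixels by a fixed factor."""
    divs = []
    for top, left, bottom, right in boxes:
        style = (f"position:absolute;top:{top * cell}px;"
                 f"left:{left * cell}px;"
                 f"width:{(right - left + 1) * cell}px;"
                 f"height:{(bottom - top + 1) * cell}px;"
                 "border:1px solid #333;")
        divs.append(f'<div style="{style}"></div>')
    return "<body>\n" + "\n".join(divs) + "\n</body>"

# Example: a 3x4 "sketch" with two rectangular regions.
sketch = [
    [1, 1, 0, 1],
    [1, 1, 0, 1],
    [0, 0, 0, 0],
]
print(boxes_to_html(find_boxes(sketch)))
```

In a realistic pipeline these steps would be preceded by image preprocessing (thresholding, noise removal) and followed by classifying each detected region as a button, text field, image, and so on, before generating the corresponding HTML element.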