Shaoyang University Graduation Design (Thesis) Task Book
Major and Class
02 Mechatronics
Student Name
Tang Shunxiang
Student ID
34
Project Title
Innovative Design of a Small Mixer
Design (Thesis) Period
March 20, 2005 to June 6, 2005
Project Type
Engineering Design
Project Nature
Simulated
I. Purpose and Main Content of the Project
Further improving the innovativeness of machinery (mechanisms) is a need of modern technology and social development, a prerequisite for production and daily life, and in particular a foundation for the development of CNC technology. This project is of real practical significance and also offers the student valuable training.
Main content: discuss and solve precision machine design problems from the standpoints of innovative mechanisms, precision positioning, and the manufacture of machine parts.
II. Basic Requirements
1. Understand the current state of mechanical technology and explain the importance of mechanical design;
2. Explain the methods and means of precision machine design through positioning, the design and manufacture of machine parts, and the design of precision mechanisms;
3. Provide both theoretical discussion and a concrete design;
4. Write the opening report and graduation thesis as required.
Note: 1. This form is completed by the supervising teacher and takes effect after approval by the college (department) and the head of the teaching and research section;
2. This form is made in triplicate: one copy each for the student, the supervisor, and the teaching and research section.
III. Conditions Already Available for the Project (laboratories, major instruments and equipment, reference materials)
1. The CNC machining laboratory of the School of Mechanical Engineering;
2. The tolerance-and-fit and measurement laboratory of the School of Mechanical Engineering;
3. Relevant books and materials held by the university library, the supervisor, and the student.
IV. Design (Thesis) Schedule
Mar. 20 – Mar. 31  Collect and organize relevant materials;
Apr. 1 – Apr. 10   Opening report;
Apr. 11 – Apr. 20  Theoretical study of innovative mechanical design;
Apr. 20 – May 15   Detailed innovative mechanical design;
May 15 – May 20    Compile the thesis and prepare for the defense.
V. Approval of the Teaching and Research Section
Section Head (signature)
Date
VI. Approval of the College (Department)
College (Department) Head (signature)    Official Seal
Date
Supervisor (signature)    Student (signature)
Shaoyang University
Graduation Design (Thesis)
Project Title: Innovative Design of a Small Mixer
Student Name: Tang Shunxiang
Student ID: 0231121034
College and Major: School of Mechanical Engineering, Class 02 Mechatronics
Supervisor: Zhao Xiaolin
Title: Associate Professor
May 30, 2005
Abstract
This small mixer is based on a traditional mixer, reduced in size, fitted with a vibration device, and given a modified rotating structure. It is driven by an electric motor; a worm-gear reducer lowers the speed and changes the direction of transmission, and an elastic pin coupling connects the reducer to the driving shaft beneath the drum, causing the drum to rotate. As the drum rotates, its own weight together with the impact of the material inside it makes the springs under the drum base compress and rebound repeatedly, so the drum vibrates while it rotates. The thesis covers: the overall design; the selection of the transmission and the motor; the design of the coupling, the worm, the driving shaft, the driven shaft, the worm-gear shaft and worm gear, and the drum; and the design of the vibration frequency. The innovations of this mixer lie in the choice of coupling and in the vibration device. The design was completed on the basis of my own observation and reflection, consultation of various references, my supervisor's guidance, and the theory I have studied.
Keywords: transmission; worm-gear reducer; elastic pin coupling; driving shaft; driven shaft; drum; vibration frequency
Summary
This small mixer takes a traditional mixer as its starting point, reduces its size, adds a vibration device, and modifies the rotating structure. It is driven by a motor through a worm-gear reducer, which lowers the speed and changes the direction of transmission; an elastic pin coupling connects the reducer to the driving shaft under the drum, making the drum rotate. While the drum rotates, its weight and the impact of the material inside it cause the springs under the drum base to compress and rebound, so the drum vibrates as it rotates. The content includes the overall design, the selection of the transmission and motor, and the design of the coupling, worm, driving shaft, driven shaft, worm-gear shaft and worm gear, drum, and vibration frequency. The innovations lie in the choice of coupling and in the vibration device. The design was completed through my own observation and study of various references, with my supervisor's guidance, on the basis of the theory I have learned.
Keywords: transmission device; worm-gear reducer; elastic pin coupling; driving shaft; driven shaft; drum; vibration frequency
Contents
Preface ................................................ 1
Chapter 1  Design Task Statement ....................... 2
Chapter 2  Overall Design .............................. 3
  2.1  Design Basis .................................... 3
  2.2  Drafting the Working Process .................... 3
  2.3  Selection of the Transmission Scheme ............ 3
  2.4  Allocation of the Transmission Ratio ............ 4
  2.5  Preliminary Design of the Drum Structure ........ 5
Chapter 3  Transmission and Motor Selection ............ 5
  3.1  Selection of the Transmission Scheme ............ 5
  3.2  Selection of the Motor .......................... 5
  3.3  Allocation of the Transmission Ratio ............ 6
  3.4  Kinematic and Dynamic Parameters of the Transmission ... 7
Chapter 4  Coupling Design ............................. 9
  4.1  Selecting the Coupling Type ..................... 9
  4.2  Calculating the Coupling Torque ................. 10
  4.3  Determining the Coupling and Its Basic Dimensions ... 10
  4.4  Additional Notes on the Coupling ................ 10
Chapter 5  Worm Design ................................. 12
  5.1  Selecting the Worm Drive Type ................... 12
  5.2  Selecting the Material .......................... 12
  5.3  Design by Tooth-Surface Contact Fatigue Strength ... 12
  5.4  Main Parameters and Geometry of the Worm and Worm Gear ... 13
  5.5  Checking the Transmission Ratio ................. 14
  5.6  Checking Tooth-Root Bending Fatigue Strength .... 14
  5.7  Accuracy Grade, Tolerances, and Surface Roughness ... 15
  5.8  Preliminary Minimum Diameter of the Worm ........ 15
  5.9  Worm Structure and the Diameter and Length of Each Section ... 16
  5.10 Preliminary Selection of Rolling Bearings ....... 16
  5.11 Circumferential Location of Parts on the Worm ... 17
Chapter 6  Driving Shaft Design ........................ 18
  6.1  Preliminary Minimum Shaft Diameter .............. 18
  6.2  Structural Design of the Shaft .................. 18
  6.3  Loads on the Shaft .............................. 19
  6.4  Forces Acting on the Shaft ...................... 20
  6.5  Strength Check under Combined Bending and Torsion ... 20
Chapter 7  Driven Shaft Design ......................... 21
  7.1  Structural Design of the Driven Shaft ........... 21
  7.2  Working Drawing of the Driven Shaft ............. 21
Chapter 8  Structural Design of the Worm-Gear Shaft and Worm Gear ... 22
  8.1  Worm-Gear Structure ............................. 22
  8.2  Power P and Speed n at the Worm Gear ............ 22
  8.3  Forces Acting on the Worm Gear .................. 22
  8.4  Structural Design of the Shaft .................. 23
  8.5  Diameter and Length of Each Shaft Section from Axial Location Requirements ... 23
  8.6  Fillet and Chamfer Dimensions on the Shaft ...... 25
Chapter 9  Drum Design ................................. 26
  9.1  Determining the Drum Structure .................. 26
  9.2  Determining the Drum Wall Thickness ............. 26
  9.3  Design of the Feed and Discharge Openings ....... 26
  9.4  Dimensions of Each Part of the Drum ............. 27
  9.5  Technical Requirements for the Drum ............. 27
Chapter 10  Vibration Frequency Design ................. 28
  10.1  Natural Frequency of the Drum by the Rayleigh Method ... 28
  10.2  Design of the Amplitude ........................ 30
Summary ................................................ 33
References ............................................. 34
Acknowledgements ....................................... 35
HAVE'2007 - IEEE International Workshop on Haptic Audio Visual Environments and their Applications, Ottawa, Canada, 12-14 October 2007
Extending Blender: Development of a Haptic Authoring Tool
Sheldon Andrews¹, Mohamad Eid², Atif Alamri², and Abdulmotaleb El Saddik²
Multimedia Communications Research Laboratory (MCRLab), School of Information Technology and Engineering, University of Ottawa, Ontario, K1N 6N5, Canada
Abstract: In this paper, we present our work to extend a well-known 3D graphic modeler, Blender, to support haptic modeling and rendering. The extended tool is named HAMLAT (Haptic Application Markup Language Authoring Tool). We describe the modifications and additions to the Blender source code which have been used to create HAMLAT. Furthermore, we present and discuss the design decisions made when developing HAMLAT, as well as an implementation "road map" which describes the changes to the Blender source code. Finally, we conclude with a discussion of our future development and research avenues.
Keywords: Haptics, HAML, Graphic Modelers, Blender, Virtual Environments
I. INTRODUCTION
A. Motivation
The increasing adoption of the haptic modality in human-computer interaction paradigms has led to a huge demand for new tools that help novice users author and edit haptic applications. Currently, the haptic application development process is a time-consuming experience that requires programming expertise. The complexity of haptic application development arises from the fact that the haptic application components (such as the haptic API, the device, the haptic rendering algorithms, etc.) need to interact with the graphic components in order to achieve synchronicity.
Additionally, there is a lack of application portability as the application is tightly coupled to a specific device that necessitates the use of its corresponding API. Therefore, device and API heterogeneity lead to the fragmentation and disorientation of both researchers and developers. In view of all these considerations, there is a clear need for an authoring tool that can build haptic applications while hiding programming details from the application modeler (such as API, device, or virtual model).
This paper describes the technical development of the Haptic Application Markup Language Authoring Tool (HAMLAT). It is intended to explain the design decisions used for developing HAMLAT and also provides an implementation "road map", describing the source code of the project.
B. Blender
HAMLAT is based on the Blender [1] software suite, an open-source 3D modeling package with a rich feature set. It has a sophisticated user interface noted for its efficiency and flexibility, as well as support for multiple file formats, a physics engine, modern computer graphics rendering, and many other features.
Because of Blender's open architecture and supportive community base, it was selected as the platform of choice for the development of HAMLAT. The open-source nature of Blender means HAMLAT can easily leverage its existing functionality and focus on integrating haptic features which make it a complete hapto-visual modeling tool, since developing a 3D modeling platform from scratch would require considerable development time and expertise to reach a comparable level of maturity.
HAMLAT builds on existing Blender components, such as the user interface and editing tools, by adding new components which focus on the representation, modification, and rendering of the haptic properties of objects in a 3D scene. By using Blender as the basis for HAMLAT, we hope to develop a 3D haptic modeling tool which has the maturity and features of Blender combined with the novelty of haptic rendering.
At the time of writing, HAMLAT is based on Blender version 2.43 source code.
C. Project Goals
As previously stated, the overall goal of the HAMLAT project is to produce a polished software application which combines the features of a modern graphics modeling tool with haptic rendering techniques. HAMLAT has the "look and feel" of a 3D graphical modeling package, but with the addition of features such as haptic rendering and haptic property descriptors. This allows artists, modelers, and developers to generate realistic 3D hapto-visual virtual environments.
A high-level block diagram of HAMLAT is shown in Figure 1. It illustrates the flow of data in the haptic modeling. HAMLAT assists the modeler, or application developer, in building hapto-visual applications which may be stored in a database for later retrieval by another haptic application. By hapto-visual application we refer to any software which displays a 3D scene both visually and haptically to a user in a virtual setting. An XML file format, called HAML [2], is used to describe the 3D scenes and store the hapto-visual environments built by a modeler for later playback to an end user.
Traditionally, building hapto-visual environments has required a strong technical and programming background. The task of haptically rendering a 3D scene is tedious, since haptic properties must be assigned to individual objects in the scene and there are currently few high-level tools for accomplishing this task. HAMLAT bridges this gap by integrating into the HAML framework and delivering a complete solution for the development of hapto-visual applications requiring no programming knowledge.
The remainder of the paper is organized as follows: in Section 2, we present the proposed architecture extensions and discuss design constraints. Section 3 describes the implementation details and how haptic properties are added and rendered within the Blender framework. In Section 4 we discuss related issues and future work avenues.
II. SYSTEM OVERVIEW AND ARCHITECTURE
The Blender design philosophy is based on three main tasks: data storage, editing, and visualization. According to the legacy documentation [3], it follows a data-visualize-edit development cycle for the 3D modeling pipeline. A 3D scene is represented using data structures within the Blender architecture. The modeler views the scene, makes changes using the editing interface which directly modifies the underlying data structures, and then the cycle repeats.
To better understand this development cycle, consider the representation of a 3D object in Blender. A 3D object may be represented by an array of vertices which have
been organized as a polygonal mesh. Users may choose to operate on any subset of this data set. Editing tasks may include operations to rotate, scale, and translate the
vertices, or perhaps a re-meshing algorithm to "cleanup" redundant vertices and transform from a quad to a triangle topology. The data is visualized using a graphical 3D renderer which is capable of displaying the object as a wireframe or as a shaded, solid surface. The visualization is necessary in order to see the effects of editing on the data. In a nutshell, this example defines the design philosophy behind Blender's architecture.
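The edit step of this cycle can be sketched with plain coordinate math. The function names below are illustrative only and are not Blender's actual API; they simply show the kind of in-place vertex operations (translate, scale, rotate) the editing interface performs before the next visualization pass.

```python
import math

def translate(vertices, dx, dy, dz):
    """Move every vertex by the given offset."""
    return [(x + dx, y + dy, z + dz) for (x, y, z) in vertices]

def scale(vertices, factor):
    """Uniformly scale the mesh about the origin."""
    return [(x * factor, y * factor, z * factor) for (x, y, z) in vertices]

def rotate_z(vertices, angle_rad):
    """Rotate the mesh about the Z axis."""
    c, s = math.cos(angle_rad), math.sin(angle_rad)
    return [(x * c - y * s, x * s + y * c, z) for (x, y, z) in vertices]

# One edit pass over a unit quad: scale it up, then move it.
quad = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)]
edited = translate(scale(quad, 2.0), 0.5, 0.5, 0.0)
```

After each such operation, the renderer would redraw the mesh so the modeler can see the effect, closing the data-visualize-edit loop.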
In Blender, data is organized as a series of lists and base data types are combined with links between items in each list, creating complex scenes from simple structures.
This allows data elements in each list to be reused, thus reducing the overall storage requirements. For example, a mesh may be linked by multiple scene objects, but the position and orientation may change for each object and the topology of the mesh remains the same. A diagram illustrating the organization of data structures and reuse of scene elements is shown in Figure 2. A scene object links to three objects, each of which link to two polygonal meshes. The meshes also share a common material property. The entire scene is rendered on one of several screens, which visualizes the scene.
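The list-and-link layout described above can be illustrated with a minimal sketch (dictionaries standing in for Blender's C structures; all names are ours): several scene objects reference one shared mesh and material rather than owning copies, so an edit to the linked mesh is visible through every object.

```python
# One shared mesh and material, referenced (not copied) by each object.
shared_mesh = {"name": "Cube", "verts": 8}
shared_material = {"name": "Steel"}

# Three scene objects link to the same mesh; only position differs.
objects = [
    {"mesh": shared_mesh, "material": shared_material, "pos": (0, 0, 0)},
    {"mesh": shared_mesh, "material": shared_material, "pos": (2, 0, 0)},
    {"mesh": shared_mesh, "material": shared_material, "pos": (4, 0, 0)},
]

# Editing the linked mesh once is visible through every object,
# e.g. after a subdivision step increases the vertex count.
shared_mesh["verts"] = 26
assert all(ob["mesh"]["verts"] == 26 for ob in objects)
```

This is the storage saving the text describes: topology lives in one place, while per-object data such as position stays with each object.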
We adopt the Blender design approach for our authoring tool. The data structures which are used to represent objects in a 3D scene have been augmented to include fields for haptic properties (e.g., stiffness, damping); user interface components (e.g., button panels) which allow the modeler to change object properties have also been updated to include support for modifying the haptic properties of an object. Additionally, an interactive hapto-visual renderer has been implemented to display the
3D scene graphically and haptically, providing the modeler or artist with immediate feedback about the changes they make to the scene. In the current version of HAMLAT, the modifications to the Blender framework include: data structures for representing haptic properties; an editing interface for modifying haptic properties; an external renderer for displaying and previewing haptically enabled scenes; and scripts which allow scenes to be imported/exported in the HAML file format.
A class diagram outlining the changes to the Blender framework is shown in Figure 3. Components which are pertinent to HAMLAT are shaded in gray. HAMLAT builds on existing Blender sub-systems by extending them for haptic modeling purposes. Data structures for representing object geometry and graphical rendering are augmented to include fields which encompass the tactile properties necessary for haptic rendering.
To allow the user to modify haptic properties, GUI components are integrated as part of the Blender editing panels. The operations triggered by these components operate directly on the data structures used for representing haptic cues and may be considered part of the editing step of the Blender design cycle.
Similarly to the built-in graphical renderer, HAMLAT uses a custom renderer for displaying 3D scenes graphically and haptically, and it is independent of the Blender renderer. This component is developed independently since haptic and graphical rendering must be performed simultaneously and synchronously. A simulation loop is used to update haptic rendering forces at a rate which maintains stability and quality. A detailed discussion of the implementation of these classes and their connectivity is given in the next section.
III. IMPLEMENTATION
A. Data Structures
A.1 Mesh Data Type
Blender uses many different data structures to represent the various types of objects in a 3D scene: a mesh contains an array of vertices; a lamp contains colour and intensity values; and a camera object contains intrinsic viewing parameters.
The Mesh data structure is used by the Blender framework to describe a polygonal mesh object. It is of particular interest for haptic rendering since many solid objects in a 3D scene may be represented using this type of data structure. The tactile and kinesthetic cues, which are displayed due to interaction with virtual objects, are typically rendered based on the geometry of the mesh. Haptic rendering is performed based primarily on data stored in this data type. Other scene components such as lamps, cameras, or lines are not intuitively rendered using force-feedback haptic devices and are therefore not of current interest for haptic rendering.
An augmented version of the Mesh data structure is shown in Figure 4. It contains fields for vertex and face data, plus some special custom data fields which allow data to be stored to and retrieved from disk and memory. We have modified this data type to include a pointer to an MHaptics data structure, which stores haptic properties such as stiffness, damping, and friction for the mesh elements (Figure 5).
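The augmented layout can be mirrored in a hypothetical Python sketch. The field names follow the paper's description (stiffness, damping, and the friction coefficients), but this is not Blender's real C struct definition, and the default values are our own illustrative assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class MHaptics:
    """Haptic surface properties; each coefficient ranges over [0, 1]."""
    stiffness: float = 0.5    # resistance to deformation
    damping: float = 0.1      # resistance to the rate of deformation
    st_friction: float = 0.2  # static friction coefficient
    dy_friction: float = 0.1  # dynamic friction coefficient

@dataclass
class Mesh:
    """Simplified stand-in for Blender's Mesh, extended with haptics."""
    verts: list = field(default_factory=list)
    faces: list = field(default_factory=list)
    haptics: MHaptics = field(default_factory=MHaptics)

# A mesh carries its haptic properties alongside its geometry.
cube = Mesh(verts=[(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)])
cube.haptics.stiffness = 0.8  # make the surface feel harder
```

The point of the design is that the haptic data travels with the mesh, so the renderer can look up tactile properties from the same object it draws.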
A.2 Edit Mesh Data Type
It should be noted that the Mesh data type has a complementary data structure, called EditMesh, which is used when editing mesh data. It holds a copy of the vertex, edge, and face data for a polygonal mesh. When the user switches to editing mode, Blender copies the data from a Mesh into an EditMesh, and when editing is complete the data is copied back.
Care must be taken to ensure that the haptic property data structure remains intact during the copy sequence. The EditMesh data structure has not been modified to contain a copy of the haptic property data, but this may change if editing of haptic properties in edit mode is required. The editing mode is mainly used to modify mesh topology and geometry, not the haptic and graphical rendering characteristics.
A.3 Haptic Properties
In this section we briefly discuss the haptic properties which may currently be modeled using HAMLAT. It is important for the modeler to understand these properties and their basis for use in haptic rendering.
The stiffness of an object defines how resistant it is to deformation by some applied force. Hard objects, such as a rock or a table, have very high stiffness; soft objects, such as a rubber ball, have low stiffness. The hardness or softness of an object is typically rendered using the spring-force equation:
f = ks * x
where the force-feedback vector f which is displayed to the user is computed using ks, the stiffness coefficient (variable name stiffness) for the object, and x, the penetration depth (displacement) of the haptic proxy into the object. The stiffness coefficient has a range of [0,1], where 0 represents no resistance to deformation and 1 represents the maximum stiffness which may be rendered by the haptic device. The damping of an object defines its resistance to the rate of deformation due to some applied force. It is typically rendered using the force equation:
f = kd * (dx/dt)
where kd is the damping coefficient (variable name damping in the MHaptics structure) and dx/dt is the velocity of the haptic proxy as it penetrates an object. The damping coefficient also has a range of [0,1] and may be used to model the viscous behaviour of a material. It also increases the stability of the haptic rendering loop for stiff materials.
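The two force laws above can be sketched directly, under the paper's conventions: f = ks·x for stiffness and f = kd·dx/dt for damping, with both coefficients clamped to [0, 1]. The function names are ours, not part of the OpenHaptics toolkit.

```python
def spring_force(ks, penetration_depth):
    """Hooke-style restoring force pushing the proxy out of the object."""
    ks = min(max(ks, 0.0), 1.0)  # stiffness coefficient clamped to [0, 1]
    return ks * penetration_depth

def damping_force(kd, penetration_velocity):
    """Force opposing the rate at which the proxy penetrates the object."""
    kd = min(max(kd, 0.0), 1.0)  # damping coefficient clamped to [0, 1]
    return kd * penetration_velocity
```

In a real simulation loop these two magnitudes would be combined along the surface normal at the contact point and sent to the device each tick.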
The static friction (variable name st_friction) and dynamic friction (variable name dy_friction) coefficients are used to model the frictional forces experienced while exploring the surface of a 3D object. Static friction is experienced when the proxy is not moving over the object's surface, and an initial force must be applied to overcome it. Dynamic friction is felt when the proxy moves across the surface, rubbing against it.
Frictional coefficients also have a range of [0,1], with a value of 0 making the surface of a 3D object feel "slippery" and a value of 1 making the object feel very rough. Frictional forces are typically rendered in a direction tangential to the collision point of the haptic proxy at the object's surface.
B. Editing
Blender uses a set of non-overlapping windows called spaces to modify various aspects of the 3D scene and its objects. Each space is divided into a set of areas and panels which are context aware. That is, they provide functionality based on the selected object type. For example, if a camera is selected, the panel will display components which allow the modeler to change the focal length and viewing angle of the camera, but these components will not appear if an object of another type is selected.
Figure 6 shows a screen shot of the button space which is used to edit properties for a haptic mesh. It includes user-interface panels which allow a modeler to change the graphical shading properties of the mesh, perform simple re-meshing operations, and to modify the haptic properties of the selected mesh.
HAMLAT follows the context-sensitive behavior of Blender by only displaying the haptic editing panel when a polygonal mesh object is selected. In the future, this panel may be duplicated to support haptic modeling for other object types, such as NURBS surfaces. The Blender framework offers many user-interface components (e.g., buttons, sliders, pop-up menus) which may be used to edit the underlying data structures. The haptic properties of mesh objects are editable using sliders, or by entering a float value into a text box located adjacent to the slider. When the value of the slider/text box is changed, it triggers an event in the Blender window sub-system. A unique identifier indicates that the event is for the haptic property panel, and the HAMLAT code is called to update the haptic properties of the currently selected mesh.
C. Hapto-Visual Rendering
Blender currently supports graphical rendering of scenes using an internal renderer or an external renderer (e.g., [4]). In this spirit, the haptic renderer used by HAMLAT has been developed as an external renderer. It uses the OpenGL and OpenHaptics toolkits [5] to perform graphic and haptic rendering, respectively.
The 3D scene being modeled is rendered using two passes: the first pass renders the scene graphically, and the second pass renders it haptically. The second pass is required because the OpenHaptics toolkit intercepts commands sent to the OpenGL pipeline and uses them to display the scene using haptic rendering techniques. In this pass, the haptic properties of each mesh object are used much in the same way color and lighting are used by graphical rendering: they define the type of material for each object. To save CPU cycles, the lighting and graphical material properties are excluded from the haptic rendering pass.
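The two-pass structure can be sketched as follows. The draw callbacks are placeholders standing in for OpenGL draw calls (pass one) and OpenHaptics-intercepted draw calls (pass two); none of this is the real API, only the control flow described above.

```python
def render_frame(scene, draw_graphic, draw_haptic):
    """Render one frame of a hapto-visual scene in two passes."""
    # Pass 1: full graphical render, with lighting and material colors.
    for obj in scene:
        draw_graphic(obj)
    # Pass 2: haptic render; lighting/material work is skipped to save
    # CPU cycles, submitting only geometry plus haptic properties.
    for obj in scene:
        draw_haptic(obj["geometry"], obj["haptics"])
```

A usage sketch: recording the calls shows every object is drawn graphically before any is drawn haptically, matching the pass ordering in the text.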
Figure 7 shows source code which is used to apply the material properties during the haptic rendering pass. The haptic renderer is independent from the Blender
framework in that it exists outside the original source code. However, it is still heavily dependent on Blender data structures and types.
D. Scripting
The Blender Python (BPy) wrapper exposes many of the internal data structures so that the internal Python scripting engine may access them. Similar to the data structures used for representing mesh objects in the native Blender framework, wrappers allow user-defined scripts to access and modify the elements in a 3D scene.
The haptic properties of a mesh object may be accessed through the Mesh wrapper class. A haptics attribute has been added to each of these classes and is accessed through the Python scripting system. Figure 8 shows Python code to read the haptic properties from a mesh object and export them to a file. Similar code is used to import/export HAML scenes from/to files.
An import script allows 3D scenes to be read from a HAML file and reproduced in the HAMLAT application; an export script allows 3D scenes to be written to a HAML file, including haptic properties, for use in other HAML applications.
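Since Figure 8 itself is not reproduced here, the following is a hedged sketch of the kind of export script it describes. The `haptics` attribute is taken from the paper; the element and attribute names in the generated markup are our assumptions, not the actual HAML schema.

```python
def haml_document(meshes):
    """Serialize per-mesh haptic properties into a HAML-style XML string.

    `meshes` is a list of dicts with a name and a haptics mapping;
    the tag/attribute names below are illustrative only.
    """
    lines = ["<haml>"]
    for mesh in meshes:
        h = mesh["haptics"]
        lines.append('  <mesh name="%s" stiffness="%.2f" damping="%.2f"/>'
                     % (mesh["name"], h["stiffness"], h["damping"]))
    lines.append("</haml>")
    return "\n".join(lines)

# Writing the document to disk is then a one-liner:
# open("scene.haml", "w").write(haml_document(scene_meshes))
```

A matching import script would walk the same elements in reverse, recreating mesh objects and reassigning their haptic attributes.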
The BPy wrappers also expose the Blender windowing system. Figure 9 shows a panel which appears when the user exports a 3D scene to the HAML file format. It allows the user to specify supplementary information about the application, such as a description, target hardware, and system requirements.
These are fields defined by the HAML specification [2] and are included with the authored scene as part of the HAML file format. The user-interface components displayed on this panel are easily extended to accommodate future revisions of HAML.
IV. CONCLUSIONS AND FUTURE WORK
The current version of HAMLAT shows that a unified modeling tool for graphics and haptics is possible. Promisingly, the features for modeling haptic properties have been integrated seamlessly into the Blender framework, which indicates that it was a good choice as a platform for the development of this tool. Blender's modular architecture will make future additions to its framework very straightforward.
Currently, HAMLAT supports basic functionality for modeling and rendering hapto-visual applications. Scenes may be created, edited, previewed, and exported as part of a database for use by other hapto-visual applications, such as the HAML player [6]. However, there is room for growth, and there are many more ways we can continue leveraging existing Blender functionality.
As for future work, we plan to extend HAMLAT to include support for other haptic platforms and devices. Currently, only the PHANTOM series of devices is supported, since the interactive renderer depends on the OpenHaptics toolkit [5]. In order to support other devices, a cross-platform library such as Chai3D or