Hello again :)
Testing of the sensor script has just been completed, so here is a new clue: the process map of the sensor script.
As you can see, it reacts to five events: you can sit on the device, touch it, or collide with it. You can also enter the radar area or say a certain chat command, so that the sensor triggers and reports the avatar.
Chat messages are parsed, so an agent can say a secret word to trigger the sensor. Collision and sensor events must pass the so-called Agentfilter, so that only new agents are reported.
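To give a rough idea of how these five triggers and the Agentfilter could fit together, here is a minimal LSL sketch. All names, the sit target, the radar range, and the secret word are made up for illustration; the actual script surely differs.

```lsl
// Illustrative sketch only - handler names are standard LSL,
// everything else (ranges, words, helper names) is hypothetical.
list gSeen; // keys of agents already reported

// The "Agentfilter": let an avatar through only once.
integer isNewAgent(key id)
{
    if (llListFindList(gSeen, [id]) != -1) return FALSE;
    gSeen += [id];
    return TRUE;
}

report(key id)
{
    llOwnerSay("Detected: " + llKey2Name(id));
}

default
{
    state_entry()
    {
        llSitTarget(<0.0, 0.0, 0.5>, ZERO_ROTATION);        // allow sitting
        llListen(0, "", NULL_KEY, "");                      // hear public chat
        llSensorRepeat("", NULL_KEY, AGENT, 10.0, PI, 5.0); // radar area
    }

    changed(integer change) // fires when an avatar sits down
    {
        if (change & CHANGED_LINK)
        {
            key av = llAvatarOnSitTarget();
            if (av != NULL_KEY) report(av);
        }
    }

    touch_start(integer n)
    {
        report(llDetectedKey(0));
    }

    collision_start(integer n) // must pass the Agentfilter
    {
        if (isNewAgent(llDetectedKey(0))) report(llDetectedKey(0));
    }

    sensor(integer n) // must pass the Agentfilter
    {
        integer i;
        for (i = 0; i < n; ++i)
            if (isNewAgent(llDetectedKey(i))) report(llDetectedKey(i));
    }

    listen(integer ch, string name, key id, string msg)
    {
        if (msg == "secret word") report(id); // placeholder chat command
    }
}
```

Note that only the collision and sensor handlers go through `isNewAgent()`, matching the process map: sit, touch, and chat report directly.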
The script is a plug-in, which means it has its own API, works on its own, and extends the device in a plug-and-play manner. That's all for now; it was just a small new tip.
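A plug-in script in LSL typically talks to the rest of the device over link messages. As a sketch of what such an API could look like (the channel number and message string are invented for this example, not taken from the actual script):

```lsl
// Hypothetical plug-in API via link messages.
integer API_CHANNEL = -4711; // made-up channel for this sketch

notifyDevice(key id)
{
    // Any other script in the linkset can pick this up
    // in its link_message() handler - no further wiring needed,
    // which is what makes the plug-in "plug & play".
    llMessageLinked(LINK_SET, API_CHANNEL, "AGENT_DETECTED", id);
}
```

Because link messages are broadcast within the linkset, the sensor script can simply be dropped into the device and every listening script receives its reports.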
Greetings
Jenna