
Native Development

New Contributor
Posts: 8
Registered: ‎03-21-2011
My Device: developer
My Carrier: n/a
Accepted Solution

Executing self generated code (e.g. can we have JIT compiler?)

Does the PlayBook OS, like Apple's iOS, prevent an application from generating its own code and then executing it?

 

For instance, I cannot run Google's V8 JavaScript engine on Apple's iOS because the operating system will not allow an application to have a memory page that is both writable and executable (a restriction intended to prevent dodgy trojan-style apps).
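
(Roughly, this is the kind of allocation iOS refuses; just a minimal POSIX sketch using mmap with MAP_ANON, not something I have tried on the PlayBook itself:)

#include <stdio.h>
#include <sys/mman.h>

int main(void) {
	/* ask for a single page that is readable, writable AND executable;
	   on stock iOS this request is refused, which is what blocks JIT engines */
	void *p = mmap(NULL, 4096, PROT_READ | PROT_WRITE | PROT_EXEC,
	               MAP_PRIVATE | MAP_ANON, -1, 0);
	if (p == MAP_FAILED) {
		perror("mmap RWX");
		return 1;
	}
	printf("got a writable+executable page at %p\n", p);
	munmap(p, 4096);
	return 0;
}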

 

So I just wanted to know whether there are similar protections on the PlayBook before I waste too much time!

 

Ultimately I would like to port V8 to the PlayBook; the ability to embed JavaScript code in a native app is pretty useful!

 

cheers

 

 

Developer
Posts: 6,473
Registered: ‎12-08-2010
My Device: PlayBook, Z10
My Carrier: none

Re: Executing self generated code (e.g. can we have JIT compiler?)

I can't answer the main question about self-generated code (other than to note that the part about memory-page access permissions doesn't apply to interpreted languages like Java, JavaScript, or Python anyway, since the generated "code" is really just data for the underlying runtime, not true machine code).

I wanted to note, however, that V8 appears to be the JavaScript engine used by QtQuick, so it seems likely it has already been ported and will be available shortly as part of the main Qt port. I may be wrong about that... but it's worth checking into to save yourself the effort.

Peter Hansen -- (BB10 and dev-related blog posts at http://peterhansen.ca.)
Author of White Noise and Battery Guru for BB10 and for PlayBook | Get more from your battery!
New Contributor
Posts: 8
Registered: ‎03-21-2011
My Device: developer
My Carrier: n/a

Re: Executing self generated code (e.g. can we have JIT compiler?)

Cheers for the rapid reply!

 

It is still relevant to languages such as JavaScript running on Google's V8 engine, though, because a just-in-time (JIT) compiler actually generates native machine-code instructions on the fly instead of simply interpreting the JavaScript source, and that leads to a dramatic increase in performance.
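
(To illustrate the distinction with a toy sketch, nothing like V8's real internals: a plain interpreter only ever treats the program as data, as in the dispatch loop below, whereas a JIT writes real machine instructions into a buffer and then jumps into it, and that second step is what needs a writable, executable page.)

#include <stdio.h>

/* toy bytecode interpreter: the "program" stays plain data the whole time,
   so no executable memory page is ever required */
enum { OP_PUSH, OP_ADD, OP_PRINT, OP_HALT };

int main(void) {
	int code[] = { OP_PUSH, 2, OP_PUSH, 3, OP_ADD, OP_PRINT, OP_HALT };
	int stack[16], sp = 0, pc = 0;
	for (;;) {
		switch (code[pc++]) {
		case OP_PUSH:  stack[sp++] = code[pc++];        break;
		case OP_ADD:   sp--; stack[sp-1] += stack[sp];  break;
		case OP_PRINT: printf("%d\n", stack[sp-1]);     break;
		case OP_HALT:  return 0;
		}
	}
}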

 

And in fact V8 will not run in a purely interpreted mode anyway.


So V8 works on Android but not on non-jailbroken iOS (due to the previously mentioned security restriction), and I want to know what to expect on the PlayBook platform.

 

So I am really just hoping for a yes-or-no answer from someone in the know about executing code in writable areas of an application's memory.

 

Thanks again

 

Developer
Posts: 6,473
Registered: ‎12-08-2010
My Device: PlayBook, Z10
My Carrier: none

Re: Executing self generated code (e.g. can we have JIT compiler?)

Ah, I didn't realize you were talking about the JIT; I thought you were referring merely to things like eval() in JavaScript, which in effect generates code at run time. Yes, you're correct that JIT compilation is affected by such controls.

 

I still don't know whether we have such controls on the PlayBook, but if we did get an answer about V8 in the QtQuick port it would be a way of answering your question. If, as you say, it has no interpreted mode, then its presence in the port would both prove there are no such controls and eliminate the need for you to do the work anyway.


Peter Hansen -- (BB10 and dev-related blog posts at http://peterhansen.ca.)
Author of White Noise and Battery Guru for BB10 and for PlayBook | Get more from your battery!
BlackBerry Development Advisor
Posts: 683
Registered: ‎11-29-2011
My Device: PRIV
My Carrier: Rogers

Re: Executing self generated code (e.g. can we have JIT compiler?)

I just tried the following code:

 

#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <errno.h>
#include <sys/mman.h>

int foo(char *str) { int len=0; while(*str++) len++; return len; }
// I am naively assuming afterfoo() will follow foo() in code space here in order to determine the size of foo()
void afterfoo() { }

int main(int argc, char **argv) {
	int size = (int)&afterfoo - (int)&foo;
	int (*ptr)(char*) = (int (*)(char*))malloc(size);
	if (mprotect(ptr, size, PROT_EXEC|PROT_WRITE|PROT_READ|PROT_NOCACHE) != 0) {
		printf("mprotect error %d\n", errno);
		return -1;
	}
	memcpy(ptr, &foo, size);
	printf("running ptr(\"test\"): %d\n", ptr("test"));
	free(ptr);
	return 0;
}

 

This produces the expected output:

$ ./test
running ptr("test"): 4

 

So in principle, it is certainly possible to execute code from the heap.

 

However...  if I invoke strlen() from inside foo(), the program core dumps on me.  It is possible that some more complex compiler / linker flags are required for success in that case.

 

Cheers,

Sean

BlackBerry Development Advisor
Posts: 683
Registered: ‎11-29-2011
My Device: PRIV
My Carrier: Rogers

Re: Executing self generated code (e.g. can we have JIT compiler?)

[ Edited ]

Okay, I resolved my function-call core dump when trying to call strlen() from inside foo()...

The compiler was generating relative (short) branch instructions for my call to strlen(), and a relative branch no longer lands in the right place once the code has been copied to the heap.

There are several ways to work around this:

 

1. Use -mlong-calls when compiling, though this may turn on long calls unnecessarily elsewhere, when all I really wanted was long calls inside my foo() function.

 

2. Use #pragma long_call / #pragma long_call_off, or __attribute__((long_call)), on the functions you wish to generate long calls for. You may also need __attribute__((weak)) if you are declaring the function in the same file it is called from. For example:

 

__attribute__((long_call,weak)) int bar(const char *str) {
	return strlen(str);
}

int foo(char *str) {
	return bar(str);	// this generates a long call to bar(), which wraps strlen() for me
}

*Note that the call to strlen() in bar() will not be a long call, but since the bar() wrapper is not executing from the heap, this is okay.

 

 

3. Load explicit function addresses inside the foo() function:

int foo(char *str) {
	size_t (*strlenptr)(const char*) = &strlen;
	return strlenptr(str);
}

 

Cheers,

Sean

Developer
Posts: 1,280
Registered: ‎03-03-2011
My Device: Playbook, Z10, Q10, Z30 with Files & Folders and Orbit of course
My Carrier: Vodafone

Re: Executing self generated code (e.g. can we have JIT compiler?)

I have not read about any such limitations, but I have not tried it myself. I believe the AS3 virtual machine (called AVM2) has contained a JIT compiler since Flash Player 9. It uses (or at least has used in the past) the nanojit tracing JIT compiler.

 

You may like to check out this open-sourced SpiderMonkey port for the PlayBook: https://github.com/blackberry/SpiderMonkey. SpiderMonkey is the JavaScript engine used by Firefox. According to that page it contains the same tracing JIT compiler as AVM2 and emits native machine code.

Files & Folders, the unified file & cloud manager for PlayBook and BB10 with SkyDrive, SugarSync, Box, Dropbox, Google Drive, Google Docs. Free 3-day trial! - Jon Webb - Innovatology - Utrecht, Netherlands
New Contributor
Posts: 8
Registered: ‎03-21-2011
My Device: developer
My Carrier: n/a

Re: Executing self generated code (e.g. can we have JIT compiler?)

Sean, you legend, that was exactly what I was after. Thanks for taking the time to answer my question so comprehensively.

 

This is leaving me with a great first impression of the support forums.

 

cheers

 

will

New Contributor
Posts: 8
Registered: ‎03-21-2011
My Device: developer
My Carrier: n/a

Re: Executing self generated code (e.g. can we have JIT compiler?)

@Innovatology Yeah, I just wanted to be sure that there was no iOS-like security restriction on application code. iOS also has a JIT available for its built-in software, but not for third-party apps.

 

I did also have a look at https://github.com/blackberry/SpiderMonkey, which is actually what led me to post here. They have ported release 1.8.0, which is oddly the very last version without a JIT (one was added in 1.8.1), and that made me wonder why this choice was made...

 

So it is good to know that no such restriction exists!

 

If I get anywhere with porting V8, I shall post a link to my GitHub repository for anyone who might be interested.

 

Many thanks for everyone's input.

 

Will