[Feature Request / Pipe Dream] No Pen Lag for 3rd Party Note Apps


#1

Note: To my current knowledge, most Boox devices use a single API to unify the way the screen updates, so I will treat this suggestion as a software problem that could be solved for all Boox devices, not just the Note series.

Disclaimer: I am a developer, but I am not deeply familiar with the specifics of the Scribble, TouchHelper, and Pen APIs listed under Onyx Intl’s GitHub. I have not written my own port with them yet, so most of my information is based on reading the docs. In other words, while I am basing this idea on something, my understanding could easily be flawed at this point.

The Problem: Honestly, the Boox devices’ embrace of Android and its ecosystem is such a smart move in the face of Amazon’s closed Kindle ecosystem. There are tons of ebook-aimed apps, and there’s no shortage of ways to read on any Boox device. However, the one thing standing in the way of making this device legendary is that the same openness doesn’t extend to 3rd-party note-taking apps: making them work well would require their developers to add support for this tablet, an occurrence that is unlikely for big-name apps like OneNote.

The Solution: If possible, I would love to see a “Note Mode” section in the app optimization menu. This would “simulate” the currently supported method of updating the screen, but instead of requiring the app developer to provide the API with the information needed to update the screen, the user could provide it through the items in this Note Mode section. The options it would have are:

  1. Toggle to enable “notes mode” in the app in question
  2. Some UI element that controls stroke width, style, and color.
  3. A UI element that customizes the way eraser marks should be rendered. It should let the user specify whether the app itself should handle them, or whether the API may render (for example) a custom-colored/patterned line on the canvas to show the eraser’s path just before things are erased.
  4. A UI element that customizes the way the lasso tool should be rendered. Like the eraser, it should be a choice between the app handling everything (so, effectively no lasso tool in the eyes of the API), or giving the lasso tool its own custom-colored/patterned line drawn by the API.
  5. An optional “in-app floating toolbar” to display when “notes mode” is enabled. This would give the user the ability to change the rendering of stroke style, color, and width while the app is running.
  6. An option to have the toolbar be hidden unless the pen’s side button is pressed.

Once “notes mode” is on and the elements are properly configured, the app is good to go! The app should render normally until the digitizer comes within hover distance of the screen (or when indicated by input from the floating button). Then the app’s rendering is “frozen”, and the Pen/TouchHelper API is called to render the pen and eraser strokes. While the app’s rendering is “frozen”, the app is actually still running, so when the pen is pulled away and the changes need to be made permanent, everything the Pen/TouchHelper API rendered should be erased and the app’s rendering restored.
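To make the freeze/unfreeze handoff concrete, here is a minimal sketch of it as a plain-Java state machine. The class and callback names are mine, not part of any SDK: on a real device the hover events would presumably come from the stylus (e.g. MotionEvent.ACTION_HOVER_ENTER/EXIT), and the begin/end actions would wrap calls like TouchHelper.setRawDrawingEnabled(true/false).

```java
// Hypothetical controller for the "freeze app rendering while the pen is near
// the screen" flow described above. Plain Java, no Android dependencies.
class NoteModeController {
    interface RenderActions {
        void freezeAppRendering();   // pause the 3rd-party app's visible updates
        void beginRawDrawing();      // hand rendering over to the raw-drawing layer
        void endRawDrawing();        // stop raw rendering, discard the overlay strokes
        void restoreAppRendering();  // let the app repaint with its own result
    }

    private final RenderActions actions;
    private boolean penNearScreen = false;

    NoteModeController(RenderActions actions) {
        this.actions = actions;
    }

    // Called when the digitizer comes within hover distance of the screen.
    void onPenHoverEnter() {
        if (!penNearScreen) {          // ignore duplicate hover events
            penNearScreen = true;
            actions.freezeAppRendering();
            actions.beginRawDrawing();
        }
    }

    // Called when the pen is pulled away and changes should become permanent.
    void onPenHoverExit() {
        if (penNearScreen) {
            penNearScreen = false;
            actions.endRawDrawing();
            actions.restoreAppRendering();
        }
    }
}
```

The point of the single boolean is that the freeze and restore actions always come in matched pairs, regardless of how noisy the hover events are.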

Even if this happens quickly, the experience will probably be kind of wonky, because of the potential differences in rendering between the 3rd-party app and the Pen/TouchHelper API. It will look like you’re using the notes app while you draw, but then the image will snap to whatever the 3rd-party app produced when it processed the input. Still, it would be responsive enough to make 3rd-party note apps usable, which makes it worth the wonkiness in my book.

The Implementation: Because this is such a far-out idea, I’ll provide some intuition for how I might go about it. I’d probably use the Pen/TouchHelper API instead of the Scribble API. Below is a quoted section[1] from the docs. Given this information, the biggest implementation issues are as follows:

  1. Figuring out what the RawInputCallback object should do, so we can pass it to TouchHelper.create(...).
  2. Figuring out what stroke width to give to touchHelper.setStrokeWidth(...), and what stroke style to give to touchHelper.setStrokeStyle(...).

The 2nd issue is solved by the 2nd, 5th, and 6th options in the optimization menu: if we assume that data is provided by the user, it immediately becomes a non-issue, because you can just pull from the configuration. The 1st is a bit more complicated, because it involves deciding how best to respond to certain kinds of pen input. It is, however, addressed by the 3rd and 4th options, so when and when not to render things is decided by the user’s configuration.
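As a sketch of "pull from the configuration", here is a plain-Java holder for the optimization-menu settings (options 1, 2, 5, and 6 above). Everything here is an assumption for illustration: the field names, the clamping range, and the defaults are not part of the Onyx SDK. On a device, these values would be replayed into touchHelper.setStrokeWidth(...) and touchHelper.setStrokeStyle(...) when notes mode is toggled on.

```java
// Hypothetical per-app config for the proposed "Note Mode" optimization menu.
class NoteModeConfig {
    static final float MIN_WIDTH = 1.0f;
    static final float MAX_WIDTH = 30.0f;

    private boolean noteModeEnabled = false;       // option 1: master toggle
    private float strokeWidth = 3.0f;              // option 2: default from the docs' example
    private boolean showFloatingToolbar = true;    // option 5: in-app toolbar
    private boolean toolbarOnSideButtonOnly = false; // option 6: hide unless side button held

    void setNoteModeEnabled(boolean enabled) { noteModeEnabled = enabled; }
    boolean isNoteModeEnabled() { return noteModeEnabled; }

    // Clamp user input so a nonsense value never reaches the SDK setters.
    void setStrokeWidth(float width) {
        strokeWidth = Math.max(MIN_WIDTH, Math.min(MAX_WIDTH, width));
    }
    float getStrokeWidth() { return strokeWidth; }

    void setShowFloatingToolbar(boolean show) { showFloatingToolbar = show; }
    boolean isToolbarVisible(boolean sideButtonPressed) {
        // Option 6 overrides option 5: toolbar appears only while the button is held.
        return showFloatingToolbar && (!toolbarOnSideButtonOnly || sideButtonPressed);
    }
    void setToolbarOnSideButtonOnly(boolean v) { toolbarOnSideButtonOnly = v; }
}
```

With a struct like this persisted per app, the stroke-width/style question from issue 2 really does reduce to a lookup.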

[1] Quoted Pen SDK Docs

2. Init TouchHelper

TouchHelper.create(view, callback)
           .setStrokeWidth(3.0f)
           .setLimitRect(limit, exclude)
           .openRawDrawing();

view is the View you want to scribble on. callback is the RawInputCallback through which you receive the scribbled data. limit is a Rect specifying the region of the view you want to scribble on. exclude is a list of Rects to be excluded from that region.

3. Control Pen
After touchHelper.openRawDrawing(), you can call touchHelper.setRawDrawingEnabled(true) to start scribbling and touchHelper.setRawDrawingEnabled(false) to pause.
You can call touchHelper.setRawDrawingRenderEnabled(false) to disable rendering during scribbling, and touchHelper.setStrokeStyle(...) to set the stroke style. To fully stop TouchHelper, you need to call touchHelper.closeRawDrawing().

4. Receive Input Data
Pen:

RawInputCallback.onBeginRawDrawing() 
-> RawInputCallback.onRawDrawingTouchPointMoveReceived() 
-> RawInputCallback.onRawDrawingTouchPointListReceived() 
-> RawInputCallback.onEndRawDrawing()

Erase:

RawInputCallback.onBeginRawErasing() 
-> RawInputCallback.onRawErasingTouchPointMoveReceived() 
-> RawInputCallback.onRawErasingTouchPointListReceived() 
-> RawInputCallback.onEndRawErasing()
var callback = new RawInputCallback() {
    
    @Override
    public void onBeginRawDrawing(boolean b, TouchPoint touchPoint) {
        // begin of stylus data
    }
    
    @Override
    public void onEndRawDrawing(boolean b, TouchPoint touchPoint) {
        // end of stylus data
    }
    
    @Override
    public void onRawDrawingTouchPointMoveReceived(TouchPoint touchPoint) {
        // stylus data during stylus moving
    }
    
    @Override
    public void onRawDrawingTouchPointListReceived(TouchPointList touchPointList) {
        // accumulated stylus data for the stroke; you will receive it before onEndRawDrawing
    }
    
    @Override
    public void onBeginRawErasing(boolean b, TouchPoint touchPoint) {
        // same as RawData, but triggered by stylus eraser button
    }
    
    @Override
    public void onEndRawErasing(boolean b, TouchPoint touchPoint) {
        // same as RawData, but triggered by stylus eraser button
    }
    
    @Override
    public void onRawErasingTouchPointMoveReceived(TouchPoint touchPoint) {
        // same as RawData, but triggered by stylus eraser button
    }
    
    @Override
    public void onRawErasingTouchPointListReceived(TouchPointList touchPointList) {
        // same as RawData, but triggered by stylus eraser button
    }
};
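To illustrate the callback sequence quoted above, here is a plain-Java sketch of how a port might buffer it: points arrive one at a time via the "move" callback and are committed as a whole stroke when the "list" callback fires, ready to be replayed into the app once its rendering is unfrozen. The Point class below is a stand-in of my own for the SDK’s TouchPoint; nothing here depends on the actual Onyx classes.

```java
import java.util.ArrayList;
import java.util.List;

// Buffers raw pen input following the onRawDrawingTouchPointMoveReceived /
// onRawDrawingTouchPointListReceived sequence described in the docs quote.
class StrokeBuffer {
    static class Point {
        final float x, y;
        Point(float x, float y) { this.x = x; this.y = y; }
    }

    private final List<Point> current = new ArrayList<>();
    private final List<List<Point>> committedStrokes = new ArrayList<>();

    // Mirrors onRawDrawingTouchPointMoveReceived(): one point per callback.
    void onMove(float x, float y) {
        current.add(new Point(x, y));
    }

    // Mirrors onRawDrawingTouchPointListReceived(): the whole stroke arrives
    // before onEndRawDrawing, so commit it and reset for the next stroke.
    void onListReceived() {
        committedStrokes.add(new ArrayList<>(current));
        current.clear();
    }

    int strokeCount() { return committedStrokes.size(); }
    int pointsInStroke(int i) { return committedStrokes.get(i).size(); }
}
```

The committed strokes are what the "restore" step would hand back to the 3rd-party app, so it can process the input through its own pipeline.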

#2

TL;DR

I completely agree that such an implementation would be extremely useful, as all third-party note-taking apps are currently completely useless. However, I don’t know if it’s technically possible, and even if it is, I doubt it’s going to happen, considering how long we’ve been waiting for elementary features like PDF output files with proper vector graphics rather than low-resolution pixels.

I think the right thing to do is to push the developers of third-party note apps to use the official Onyx Boox SDK, so the apps run as smoothly as the first-party one. I have seen users suggesting such implementations on the official message boards of several apps, like Squid, LectureNotes, and Evernote. Again, given how niche a product Onyx readers are, this is not something I’d expect anytime soon. The best thing you could probably do is offer your assistance as a developer.


#3

Honestly, writing a port of existing Android apps is definitely on my todo list. I don’t have a lot of time to pursue such projects because I’m a full-time student right now, but I’m a huge fan of apps like Stylus Labs’ Write, so decompiling the APK to see how easy or hard it’d be to add support is definitely up there in terms of personal projects.

That being said, I do think this idea is technically feasible, and is arguably more practical than expecting big name 3rd party app developers to implement an e-ink port for a fringe minority of their user base. In most cases, I just don’t see it happening, so it’s either up to the company to make sure other solutions are supported, or up to the users to support those solutions themselves.


#4

Thank you for your suggestions here. Our colleagues are working to improve support for 3rd-party note apps. Please be patient.


#5

Really? I was unaware there was work being done to improve 3rd party note apps. I look forward to the firmware update in that case. Can you give any ETA on that, or is it just “in the future?”


#6

This sounds to me like the average “we’re always looking at ways to improve our products” stuff. I wouldn’t expect any groundbreaking changes in the foreseeable future, simply because I believe it is technically very hard to get third-party note apps to work properly with the current software architecture, unless those apps use the official Onyx Boox SDK.


#7

I am sorry to inform you that this will not be available in the November firmware update. We are paying close attention to this problem and are already working on it.


#8

This is pretty close to what I had/have in mind, except it would just refresh the screen after a couple of seconds of no stylus activity. It would require auto-disabling touch during stylus activity, though.
Even just exposing the functionality for toggling raw stylus input mode in /sys/ would be enough (a shell script to auto-toggle it on stylus activity for certain apps is easy enough to write), but I couldn’t find it. (Theoretically, one could already accomplish this with an appropriate service app, though that would probably require root unless the SDK actually broadcasts events every time the stylus enters or exits the screen.)

Or just grab the apk and edit the smali like I did: https://youtu.be/3eHHH9-QHNk
Not sure if it’s visible or not, but I made it so that it pauses TouchHelper after 2 seconds of no activity to refresh the screen, so it’s not as wonky as you mentioned (if you’ve ever used stroke smoothing, you’re probably already used to it).
It actually doesn’t take that long to implement just the handwriting part. The hard part is figuring out where to insert things (easy if you attach a debugger).
The thing with doing it yourself and distributing hacks like this is that Onyx might no longer feel compelled to make an effort (there are already existing solutions, so why bother?), and other companies might just take it, rename it, and sell it as their own (call it BoyueNotes or something, assuming they use very similar SDKs).
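The "pause after 2 seconds of no activity" trick above can be sketched as a debounce timer in plain Java. The class and names below are mine, not from any SDK; on a device, the injected Runnable would be where you call touchHelper.setRawDrawingEnabled(false) and trigger a screen refresh, and onStylusActivity() would be called from every raw drawing callback.

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.ScheduledFuture;
import java.util.concurrent.TimeUnit;

// Hypothetical inactivity debouncer: each stylus event postpones the refresh,
// so onIdle fires once, idleMillis after the last event.
class InactivityRefresher {
    private final ScheduledExecutorService scheduler =
            Executors.newSingleThreadScheduledExecutor();
    private final Runnable onIdle;      // e.g. pause TouchHelper + refresh screen
    private final long idleMillis;      // e.g. 2000 for the 2-second pause
    private ScheduledFuture<?> pending;

    InactivityRefresher(Runnable onIdle, long idleMillis) {
        this.onIdle = onIdle;
        this.idleMillis = idleMillis;
    }

    // Call on every raw stylus event; cancels any queued refresh and re-arms it.
    synchronized void onStylusActivity() {
        if (pending != null) pending.cancel(false);
        pending = scheduler.schedule(onIdle, idleMillis, TimeUnit.MILLISECONDS);
    }

    void shutdown() { scheduler.shutdownNow(); }
}
```

Resetting the timer on every event is what keeps the refresh from firing mid-stroke.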


#9

Wow, this is super interesting! I know very little about Android development specifically; I dabbled in it a while back when I was learning Java, but not enough to have much experience. My first instinct was to decompile an APK and see if I could implement the API for it. I didn’t even think of a smali hack.

Do you have a Github or git repo with any of this in it? I totally get the hesitation about distributing hacks like this, which IMO is why it would be such a good idea for Onyx to open-source most to all of the software running on their devices. If you’re uncomfortable with sending me that, could you provide any tips where to start, if I want to try a similar project myself?


#10

No, but it’s pretty easy to come up with.

Just get an open-source drawing app (I think I used Simple Draw), then:

  1. Add a few lines to create, resume, pause, refresh, and destroy TouchHelper.
  2. Compile the app both before and after the addition (and test that the addition actually works on the device: you should see the stroke width change after pausing, and the screen should refresh).
  3. Decompile both builds and diff them to see what to add (both the TouchHelper calls and the SDK itself). For resume, for example, you should see:
invoke-virtual {v0, v1}, Lcom/onyx/android/sdk/pen/TouchHelper;->setRawDrawingEnabled(Z)Lcom/onyx/android/sdk/pen/TouchHelper;

Install the APK you want to mod in the Android Studio emulator and use the debugger to find where in the smali to add the TouchHelper calls.

Finally, decompile the APK you want to mod, add in the lines for TouchHelper, add the SDK, and recompile.


#11

Alright, thanks so much! If I have time I’ll try this out!


#12

In the meantime, could you give us a sense of when this feature will see some improvement? I’m not asking for anything astronomical, but this is one of the top issues for me at the moment. If this feature is years off into the future, that’s fine, but I’d like to know it’s years off.