Xiaoan (Sean) Liu

Research Intern @Google

I research future interfaces between humans, things, and AI agents.

I'm making contextual intelligence happen outside of traditional 2D displays.

I call it Reality Computing.

I've been exploring the following questions from my undergrad through my master's:

How does intelligence (human and AI) perceive?

How does it express itself outside of 2D screens?

How does intelligent computing happen in Reality?

Reality is my playground:
I've been exploring the above questions by designing, hacking, and building.




Reality Proxy [Manuscript in preparation]

What if interacting with physical objects in MR were as simple and precise as managing their digital proxies?


RealiTips [Manuscript in preparation]

An assistant that can see your desktop and guide you through making coffee step by step.


Fusion Reality

It's basically a JARVIS.


Eagle Vision [Work In Progress] - See beyond physical limitations.

Collaborative XR (with Ken Perlin)

We developed an XR classroom system where everyone can learn computer graphics together in an immersive environment.


Programmable Reality (an imaginary vision)

Why use Unity when you can directly program the behaviors of objects in the scene?

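A minimal, purely illustrative sketch of this vision in Python (not an implementation of any existing system; SceneObject, on, and emit are all hypothetical names): recognized physical objects become scriptable handles, and behaviors attach to the objects themselves rather than to engine-side stand-ins.

```python
# Hypothetical sketch: script the behavior of recognized real-world objects
# directly, instead of authoring a parallel scene in a game engine.
from dataclasses import dataclass, field
from typing import Callable, Dict, List, Tuple


@dataclass
class SceneObject:
    """Handle to a physical object recognized in the user's space (hypothetical API)."""
    name: str
    position: Tuple[float, float, float] = (0.0, 0.0, 0.0)
    _handlers: Dict[str, List[Callable]] = field(default_factory=dict)

    def on(self, event: str, handler: Callable) -> None:
        """Attach a behavior to an event such as 'near', 'grabbed', or 'gazed_at'."""
        self._handlers.setdefault(event, []).append(handler)

    def emit(self, event: str, **context) -> None:
        """Called by the (imagined) perception runtime when it detects the event."""
        for handler in self._handlers.get(event, []):
            handler(self, **context)


# "Program reality": the desk lamp reacts when the user walks up to it.
lamp = SceneObject("desk_lamp")
lamp.on("near", lambda obj, **ctx: print(f"{obj.name}: turning on for {ctx.get('user')}"))

# A real runtime would emit events from live sensing; here we simulate one.
lamp.emit("near", user="Sean")
```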

Some explorations of technical pipelines (such as image generation and pose tracking with an AR camera):
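On the pose-tracking side, here is a minimal sketch of marker-based camera pose estimation with OpenCV's solvePnP; the corner coordinates and camera intrinsics below are made-up placeholders, not values from the projects above.

```python
import numpy as np
import cv2

# Minimal sketch: recover the pose of a square marker from four known 3D corner
# points and their detected 2D pixel locations (all values are illustrative).
object_points = np.array([              # marker corners in the marker's frame (meters)
    [-0.05, -0.05, 0.0],
    [ 0.05, -0.05, 0.0],
    [ 0.05,  0.05, 0.0],
    [-0.05,  0.05, 0.0],
], dtype=np.float64)

image_points = np.array([               # corresponding detected pixel coordinates
    [320.0, 260.0],
    [410.0, 255.0],
    [415.0, 345.0],
    [318.0, 350.0],
], dtype=np.float64)

camera_matrix = np.array([              # assumed pinhole intrinsics (fx, fy, cx, cy)
    [800.0,   0.0, 320.0],
    [  0.0, 800.0, 240.0],
    [  0.0,   0.0,   1.0],
], dtype=np.float64)
dist_coeffs = np.zeros(5)               # assume negligible lens distortion

ok, rvec, tvec = cv2.solvePnP(object_points, image_points, camera_matrix, dist_coeffs)
if ok:
    rotation, _ = cv2.Rodrigues(rvec)   # 3x3 rotation of the marker in the camera frame
    print("marker translation (m):", tvec.ravel())
```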

Publication:

Keru Wang, Pincun Liu, Yushen Hu, Xiaoan Liu, Zhu Wang, Ken Perlin

Ye Jin, Xiaoxi Shen, Huiling Peng, Xiaoan Liu, Jingli Qin, Jiayang Li, Jintao Xie, Peizhong Gao, Guyue Zhou, Jiangtao Gong

Works during my undergrad: