Creating gaits such as walking or trotting is quite a challenging task if you do it manually. It is therefore helpful to have a kinematic model of our cat. I want to show you my approach to creating a new gait via inverse kinematics, using the IKPY library (https://github.com/Phylliade/ikpy).

Inverse kinematics, in a nutshell, means that you provide a point in space and the corresponding motor angles are calculated. In IKPY this is achieved by creating "chains", e.g. upper_arm--lower_arm--paw. You then decide where to place the paw in space. Below you can see an image of Nybble's kinematic model, to give you a first impression.
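To make the idea concrete, here is a minimal sketch of the same principle for a single planar two-link limb, independent of IKPY. The helper name `two_link_ik` is my own, and the segment length of 5 cm matches the `armLength` used in the code further down:

```
import numpy as np

def two_link_ik(x, z, l1=5.0, l2=5.0):
    """Given a paw target (x, z) in the limb's plane, return the
    shoulder and elbow angles (radians) of a two-link limb."""
    d2 = x*x + z*z
    # Law of cosines gives the elbow angle from the target distance
    cos_elbow = (d2 - l1*l1 - l2*l2) / (2*l1*l2)
    elbow = np.arccos(np.clip(cos_elbow, -1.0, 1.0))
    # Shoulder angle: aim at the target, then correct for the bent elbow
    shoulder = np.arctan2(z, x) - np.arctan2(l2*np.sin(elbow), l1 + l2*np.cos(elbow))
    return shoulder, elbow
```

IKPY does the same job numerically for longer chains; this closed-form version is just the smallest example of "point in, angles out".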

For the first tests I created an alternative walking gait that follows a swing-stance pattern of this form:

```
# Walking pattern (1 == swing)
# [0,0,0,0,0, 0,0,0,0,0, 0,0,0,0,0, 1,1,1,1,1], # LL
# [0,0,0,0,0, 1,1,1,1,1, 0,0,0,0,0, 0,0,0,0,0], # RL
# [0,0,0,0,0, 0,0,0,0,0, 1,1,1,1,1, 0,0,0,0,0], # LA
# [1,1,1,1,1, 0,0,0,0,0, 0,0,0,0,0, 0,0,0,0,0], # RA
```
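As a quick sanity check, this pattern can also be built programmatically. A small sketch (the 20-frame cycle and limb order follow the comment above; the `swing_start` table is my own naming) confirming that exactly one limb is in swing at any time:

```
import numpy as np

frames = 20
# Swing start frame per limb, matching the pattern above (1 == swing, 5 frames each)
swing_start = {"LL": 15, "RL": 5, "LA": 10, "RA": 0}
pattern = np.zeros((4, frames), dtype=int)
for row, limb in enumerate(["LL", "RL", "LA", "RA"]):
    pattern[row, swing_start[limb]:swing_start[limb] + 5] = 1

print(pattern.sum(axis=0))  # all ones: exactly one limb swings per frame
```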

The longitudinal movement (blue) and the lift (orange) are based on sine/cosine functions that follow the phase-shifted pattern shown above.

In combination with the IK model, this pattern leads to the following movement:

I also tried the new gait on Nybble for validation, and it works quite well. The lift has an amplitude of 1 cm, so Nybble can now step over a small obstacle.

Further updates and the Bittle version can be found here: https://github.com/ger01d/kinematic-model-opencat

So here's the code for the version above:

```
import ikpy
from ikpy.chain import Chain
from ikpy.link import OriginLink, URDFLink
import numpy as np
from numpy import sin, cos, pi
import matplotlib.pyplot as plt
import mpl_toolkits.mplot3d.axes3d as p3
deg2rad = pi/180
# Values in cm
armLength = 5
bodyLength = 12
bodyWidth = 10
distanceFloor = 7
stepLength = 5
swingHeight = 1
leanLeft = 0 # Not working, yet
leanRight = -leanLeft # Not working, yet
leanForward = 0 # cm
left_arm = Chain(name='left_arm', links=[
    URDFLink(
        name="center",
        translation_vector=[leanForward, 0, distanceFloor],
        orientation=[0, 0, 0],
        rotation=[0, 0, 0],
    ),
    URDFLink(
        name="shoulder",
        translation_vector=[0, bodyWidth/2, leanLeft],
        orientation=[0, 0, 0],
        rotation=[0, 1, 0],
        bounds=(0.1*deg2rad, 179*deg2rad),
    ),
    URDFLink(
        name="upperArm",
        translation_vector=[armLength, 0, 0],
        orientation=[0, 0, 0],
        rotation=[0, 1, 0],
        bounds=(-179*deg2rad, -0.1*deg2rad),
    ),
    URDFLink(
        name="lowerArm",
        translation_vector=[armLength, 0, 0],
        orientation=[0, 0, 0],
        rotation=[0, 0, 0],
    )
])
right_arm = Chain(name='right_arm', links=[
    URDFLink(
        name="center",
        translation_vector=[leanForward, 0, distanceFloor],
        orientation=[0, 0, 0],
        rotation=[0, 0, 0],
    ),
    URDFLink(
        name="shoulder",
        translation_vector=[0, -bodyWidth/2, leanRight],
        orientation=[0, 0, 0],
        rotation=[0, 1, 0],
        bounds=(0.1*deg2rad, 179*deg2rad),
    ),
    URDFLink(
        name="upperArm",
        translation_vector=[armLength, 0, 0],
        orientation=[0, 0, 0],
        rotation=[0, 1, 0],
        bounds=(-179*deg2rad, -1*deg2rad),
    ),
    URDFLink(
        name="lowerArm",
        translation_vector=[armLength, 0, 0],
        orientation=[0, 0, 0],
        rotation=[0, 1, 0],
    )
])
left_leg = Chain(name='left_leg', links=[
    URDFLink(
        name="center",
        translation_vector=[leanForward, 0, distanceFloor],
        orientation=[0, 0, 0],
        rotation=[0, 0, 0],
    ),
    URDFLink(
        name="butt",
        translation_vector=[-bodyLength, 0, 0],
        orientation=[0, 0, 0],
        rotation=[0, 0, 0],
    ),
    URDFLink(
        name="hip",
        translation_vector=[0, bodyWidth/2, leanLeft],
        orientation=[0, 0, 0],
        rotation=[0, 1, 0],
        bounds=(0.1*deg2rad, 100*deg2rad),
    ),
    URDFLink(
        name="upperLeg",
        translation_vector=[armLength, 0, 0],
        orientation=[0, 0, 0],
        rotation=[0, 1, 0],
        bounds=(1*deg2rad, 179*deg2rad),
    ),
    URDFLink(
        name="lowerLeg",
        translation_vector=[armLength, 0, 0],
        orientation=[0, 0, 0],
        rotation=[0, 1, 0],
    )
])
right_leg = Chain(name='right_leg', links=[
    URDFLink(
        name="center",
        translation_vector=[leanForward, 0, distanceFloor],
        orientation=[0, 0, 0],
        rotation=[0, 0, 0],
    ),
    URDFLink(
        name="butt",
        translation_vector=[-bodyLength, 0, 0],
        orientation=[0, 0, 0],
        rotation=[0, 0, 0],
    ),
    URDFLink(
        name="hip",
        translation_vector=[0, -bodyWidth/2, leanRight],
        orientation=[0, 0, 0],
        rotation=[0, 1, 0],
        bounds=(0.1*deg2rad, 100*deg2rad),
    ),
    URDFLink(
        name="upperLeg",
        translation_vector=[armLength, 0, 0],
        orientation=[0, 0, 0],
        rotation=[0, 1, 0],
        bounds=(1*deg2rad, 179*deg2rad),
    ),
    URDFLink(
        name="lowerLeg",
        translation_vector=[armLength, 0, 0],
        orientation=[0, 0, 0],
        rotation=[0, 1, 0],
    )
])
def buildGait(frames):
    frame = np.arange(0, frames)
    swingEnd = pi/2
    # Longitudinal movement: fast swing phase, then slower stance phase
    swing = -cos(2*(frame*2*pi/frames))
    stance = cos(2/3*(frame*2*pi/frames - swingEnd))
    swingSlice = np.less_equal(frame, swingEnd/(2*pi/frames))
    stanceSlice = np.invert(swingSlice)
    longitudinalMovement = np.concatenate((swing[swingSlice], stance[stanceSlice]))
    longitudinalMovement = np.concatenate((longitudinalMovement, longitudinalMovement, longitudinalMovement, longitudinalMovement))
    # Vertical movement: sine lift during swing, flat zero during stance
    lift = sin(2*(frame*2*pi/frames))
    liftSlice = swingSlice
    verticalMovement = np.concatenate((lift[liftSlice], np.zeros(np.count_nonzero(stanceSlice))))
    verticalMovement = np.concatenate((verticalMovement, verticalMovement, verticalMovement, verticalMovement))
    return longitudinalMovement, verticalMovement
frames = 43
longitudinalMovement, verticalMovement = buildGait(frames)
# Walking pattern
# [0,0,0,0,0, 0,0,0,0,0, 0,0,0,0,0, 1,1,1,1,1], # LL
# [0,0,0,0,0, 1,1,1,1,1, 0,0,0,0,0, 0,0,0,0,0], # RL
# [0,0,0,0,0, 0,0,0,0,0, 1,1,1,1,1, 0,0,0,0,0], # LA
# [1,1,1,1,1, 0,0,0,0,0, 0,0,0,0,0, 0,0,0,0,0], # RA
# Phase shift between arms/legs, [RA, RL, LA, LL]
shiftFrame = np.round(np.array([0, pi/2, pi, 3*pi/2])/2/pi*frames)
shiftFrame = shiftFrame.astype(int)
for frame in range(0, frames):
    # Inverse kinematics per limb; the corrections map IK angles to servo zero positions
    right_arm_angles = 180/pi*right_arm.inverse_kinematics(target_position=np.array([stepLength*longitudinalMovement[frame], -bodyWidth/2, swingHeight*verticalMovement[frame]]))
    right_arm_correction = np.array([0, -90, 90, 0])
    right_arm_angles = np.round(right_arm_angles + right_arm_correction)
    right_arm_angles = np.delete(right_arm_angles, np.s_[0, 3], axis=0)
    right_arm_angles = right_arm_angles.astype(int)
    right_leg_angles = 180/pi*right_leg.inverse_kinematics(target_position=np.array([-bodyLength+stepLength*longitudinalMovement[frame+shiftFrame[1]], -bodyWidth/2, swingHeight*verticalMovement[frame+shiftFrame[1]]]))
    right_leg_correction = np.array([0, 0, -90, -90, 0])
    right_leg_angles = np.round(right_leg_angles + right_leg_correction)
    right_leg_angles = np.delete(right_leg_angles, np.s_[0, 1, 4], axis=0)
    right_leg_angles = right_leg_angles.astype(int)
    left_arm_angles = 180/pi*left_arm.inverse_kinematics(target_position=np.array([stepLength*longitudinalMovement[frame+shiftFrame[2]], +bodyWidth/2, swingHeight*verticalMovement[frame+shiftFrame[2]]]))
    left_arm_correction = np.array([0, -90, 90, 0])
    left_arm_angles = np.round(left_arm_angles + left_arm_correction)
    left_arm_angles = np.delete(left_arm_angles, np.s_[0, 3], axis=0)
    left_arm_angles = left_arm_angles.astype(int)
    left_leg_angles = 180/pi*left_leg.inverse_kinematics(target_position=np.array([-bodyLength+stepLength*longitudinalMovement[frame+shiftFrame[3]], +bodyWidth/2, swingHeight*verticalMovement[frame+shiftFrame[3]]]))
    left_leg_correction = np.array([0, 0, -90, -90, 0])
    left_leg_angles = np.round(left_leg_angles + left_leg_correction)
    left_leg_angles = np.delete(left_leg_angles, np.s_[0, 1, 4], axis=0)
    left_leg_angles = left_leg_angles.astype(int)
    # Writing sequence to file
    gait_sequence = np.concatenate((left_arm_angles, right_arm_angles, right_leg_angles, left_leg_angles))
    print(frame, " ", gait_sequence)
    f = open("gait_sequence.csv", "a")
    f.write("#")
    np.savetxt(f, gait_sequence[[0, 2, 4, 6, 1, 3, 5, 7]], fmt='%3.1i', delimiter=',', newline=", ")
    f.write("+")
    f.write("\n")
    f.close()
    # Create plot and image for each frame
    fig = plt.figure()
    ax = p3.Axes3D(fig)
    ax.set_box_aspect([3, 3/2, 1])
    ax.set_xlim3d([-20, 10])
    ax.set_xlabel('X')
    ax.set_ylim3d([-10, 10])
    ax.set_ylabel('Y')
    ax.set_zlim3d([0.0, 10])
    ax.set_zlabel('Z')
    right_arm.plot(right_arm.inverse_kinematics(target_position=np.array([stepLength*longitudinalMovement[frame], -bodyWidth/2, swingHeight*verticalMovement[frame]])), ax)
    right_leg.plot(right_leg.inverse_kinematics(target_position=np.array([-bodyLength+stepLength*longitudinalMovement[frame+shiftFrame[1]], -bodyWidth/2, swingHeight*verticalMovement[frame+shiftFrame[1]]])), ax)
    left_arm.plot(left_arm.inverse_kinematics(target_position=np.array([stepLength*longitudinalMovement[frame+shiftFrame[2]], +bodyWidth/2, swingHeight*verticalMovement[frame+shiftFrame[2]]])), ax)
    left_leg.plot(left_leg.inverse_kinematics(target_position=np.array([-bodyLength+stepLength*longitudinalMovement[frame+shiftFrame[3]], +bodyWidth/2, swingHeight*verticalMovement[frame+shiftFrame[3]]])), ax)
    figureName = "Nybble_" + str(frame)
    plt.savefig(figureName)
    # plt.show()
```

Nice work Gero! I am very new here; I just started learning Arduino a couple of months ago, and I find this project and your kinematic tool very interesting and useful. I noticed that in your code you concatenate longitudinalMovement and verticalMovement four times. Since the gait pattern for each leg/arm only starts at 0, 1/4, 1/2 or 3/4 of the frames, concatenating the original longitudinalMovement/verticalMovement twice is enough.

Jason
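Jason's observation can be checked directly: with the phase shifts used in the code, the largest index ever requested stays below two full pattern lengths. A quick sketch, reusing `frames = 43` and the `shiftFrame` formula from the code above:

```
import numpy as np
from numpy import pi

frames = 43
# Same phase-shift computation as in the gait code above
shiftFrame = np.round(np.array([0, pi/2, pi, 3*pi/2]) / (2*pi) * frames).astype(int)
max_index = (frames - 1) + shiftFrame.max()  # worst case: last frame, largest shift
print(max_index, 2*frames)  # 74 < 86, so two concatenated copies suffice
```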

aijnec@163.com

I created a Python program to see how Bittle behaves. This will not work for studying his gait. It is also not a very smart program, because I made it by piecing together various sample files.

The program reads a file with the behavior converted to csv.

The yellow plate shown in the graph is Bittle's new ground, and the red arrows are the normal vectors. I really want to modify Bittle's position to match the new ground, but I don't know how.

The result is also output as a gif animation file.

For some reason, the csv file could not be uploaded, so please rename action.txt to action.csv.

There is a runtime error:

```
Python 3.6.8
ikpy 3.2.2

Traceback (most recent call last):
  File "kinematics_nybble.py", line 29, in <module>
    rotation=[0, 0, 0],
TypeError: __init__() got an unexpected keyword argument 'translation_vector'
```
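That TypeError is consistent with an ikpy API change: the script above uses the ikpy 2.x keyword names, while ikpy 3.x renamed the URDFLink arguments (`translation_vector` became `origin_translation`, and `orientation` became `origin_orientation`). An untested fragment of what a link declaration looks like under ikpy 3.x, with values copied from the shoulder link above; please verify against the ikpy documentation for your installed version:

```
from ikpy.link import URDFLink

# ikpy 3.x kwarg names; the 2.x equivalents are noted in the comments
shoulder = URDFLink(
    name="shoulder",
    origin_translation=[0, 5, 0],   # ikpy 2.x: translation_vector
    origin_orientation=[0, 0, 0],   # ikpy 2.x: orientation
    rotation=[0, 1, 0],
)
```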

Great work! I came across that same article a few days ago when I was looking for literature to understand IK and how the functions are derived. At the time I wondered how I could adapt the code to achieve or verify the same foot trajectory, or any other shape to quickly experiment with new gaits. Please keep posting your progress.

@Gero very good attempt. I was interested to see the foot trajectory of your work, so I mapped the horizontal longitudinal movement against the vertical movement. As expected, it is a simple sine curve with a flat base:

Conversely, based on your derived gait numbers, I applied forward kinematics to verify that I can also work backwards and arrive at the same trajectory without using your original sine/cosine equations:

The reason I'm keen to dig deeper is that the gait numbers provided in the original Bittle InstinctBittle.h tend to result in Bittle dragging its feet when it walks (hence @Rongzhong Li's recommendation of running Bittle without socks, which doesn't make sense in normal circumstances). The feet don't seem to "push" Bittle up when they touch the ground, resulting in unstable movement. As Li did not disclose the original equations from which the numbers in InstinctBittle.h were derived, I applied the same forward kinematics approach to map out the original Bittle's walk, run, crawl and trot foot trajectories:

Similar to your work, Li's gait numbers all have a "flat" trajectory base, i.e. Bittle does not "push" itself up when a foot touches the ground, resulting in unstable forward movement. In a separate quadruped project, I came across this paper describing a more sophisticated foot trajectory:

As one can see, during the stance phase the foot trajectory is slightly convex downwards, which has a number of benefits according to the authors. As I am still awaiting a few parts from eBay for my other project, I will work on applying this revised trajectory to Bittle and see if it results in better movement.
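The forward-kinematics mapping described above only needs the forward direction of the same two-link geometry. A minimal planar sketch (the helper name `two_link_fk` is my own; the segment length of 5 cm matches `armLength` in the code above); sweeping a recorded sequence of joint angles through it traces out the foot trajectory:

```
import numpy as np

def two_link_fk(shoulder, elbow, l1=5.0, l2=5.0):
    """Foot position (x, z) of a planar two-link limb from joint angles (radians)."""
    x = l1*np.cos(shoulder) + l2*np.cos(shoulder + elbow)
    z = l1*np.sin(shoulder) + l2*np.sin(shoulder + elbow)
    return x, z

print(two_link_fk(0.0, 0.0))  # fully extended limb: (10.0, 0.0)
```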

It depends on what you want to do with your robot. I will try to make tutorials about automation, which lends itself better to easy tutorials and GUI tools; I have already used some of the Codecraft tools (I think it's them) with the micro:bit and Meowbit. A funny example I have already seen here is making the robot react depending on whether you are wearing a mask or not, and there are many other fun possibilities! It's true that GUI tools can make it easier to add movements or postures from the community, but generating movements is necessarily tied to a more complicated procedure. However, that's only a small part of the fun you can have with your robot!

I agree that OpenCat is made as an education tool for everyone, from noobs to nerds! It's cool that there are different levels from which to start learning; everyone wins! I probably wouldn't have bought the robot if we couldn't do advanced things (at my level, GUI tools are boring), and yet I will try to help noobs have fun (but sadly time is limited... 😩). Finally, from what I know of software engineering, I strongly doubt that Codecraft will someday release an easy tool for inverse kinematics.

In fact, it is too cumbersome to work with formulas, so graphical software could be more suitable and would make Bittle and the cat products more attractive. I think Codecraft is a good starting point for generating postures and behaviors, and for this reason you should intervene with the Codecraft company so that they fix the problems with the PREVIEW button, which does not work, and with the data sequences when inserting movements while trying to create a Skill. Only you, @Rongzhong Li, can push for bug fixes; above all, it is in your company's interest to make the product user-friendly and suitable for everyone.

I summarize below what I understand, hoping to take stock of the situation:

1) You made a program using Python's IKPY library.

2) In your program you entered the data of a hypothetical walking pattern and plotted everything in the graph (the one with the orange and blue lines).

3) The graph is unfortunately static; the data can only be entered in the program, which then generates the graph and, when it is closed, a .csv file with the values for the hypothesized movement.

From my point of view, what is needed instead is a 3D view, like one of the images generated by the program, with a GUI that makes it more dynamic: you could interact with the image itself, move the object (Bittle) on the horizontal plane (unfortunately it does not fly) to generate a new position, and, even more importantly, set new positions by grabbing a limb (shoulder or knee) with the mouse and dragging it, generating defined sequences (frames) that could then be exported and imported into the Instinct.h sketch.

I looked around and found a couple of links that I will check to see if I can recover something. I post them here, but it would be interesting if a good graphics programmer gave us a hand, as I don't think it is difficult to create such software, even with the help of a spreadsheet integrated with VBA or a different high-level language. Meanwhile, the links are the following:

http://gazebosim.org/

https://robodk.com/doc/it/Basic-Guide.html#Start

We need something like instead