{"payload":{"feedbackUrl":"https://github.com/orgs/community/discussions/53140","repo":{"id":756646135,"defaultBranch":"master","name":"FRC2024","ownerLogin":"Team7520","currentUserCanPush":false,"isFork":true,"isEmpty":false,"createdAt":"2024-02-13T02:43:26.000Z","ownerAvatar":"https://avatars.githubusercontent.com/u/47093277?v=4","public":true,"private":false,"isOrgOwned":true},"refInfo":{"name":"","listCacheKey":"v0:1724212384.0","currentOid":""},"activityList":{"items":[{"before":"51add4930b92b2616e533647d8a296893bdbae8b","after":"944a4cb2beaf51a7232f4cdcfa7abb3b40ed1fc0","ref":"refs/heads/CNE-2024-AUTONOMOUS-PROJECT","pushedAt":"2024-08-25T07:10:43.000Z","pushType":"push","commitsCount":1,"pusher":{"login":"kitzoyan","name":null,"path":"/kitzoyan","primaryAvatarUrl":"https://avatars.githubusercontent.com/u/110635359?s=80&v=4"},"commit":{"message":"Tested: PATH COMMAND CHAINING, Note Distance Decreased\n\nAfter some hard work, we can finally chain OTF paths with other commands. The thing with OTF paths is that the are explicitly scheduled, which interrupts commands calling it. For example, putting the scheduler inside InstantCommand actually exits the command as interrupted, and like wise when putting scheduler inside a normal command type .java file, end() method is called because interrupt is true. As such, there really is no way to convert OTF path into a command, except to have event markers toggle boolean variables and let Robot Container receive triggers. Chaining here is done such that two booleans, one for determining when the path ends, and another for determining if NotePickUp was called, are factors to a trigger. When NotePickUp is true and pathActive is false (aka ended), then it triggers other commands (and this can recursively call NotePickUp again). This is the only way we can chain paths to other commands.\n\nSince path planner is all about the robot center, the robot will move to the center of a detected note - meaning it will run over on top the note's location. The issue with this is that when picking up notes close to barriers or walls, the robot will definitely crash its intake. If the robot attempts to lift the intake when it is stuck, it will slip and damage the intake, making it unusable. To avoid this, we'll need to make sure that the robot DOES NOT fully run over on top the note, but rather the intake should. The intake to the center of the robot is roughly 0.5m, so we'll reduce the X distance of the note (forward distance relative to robot). When approaching notes directly in front, the robot tends to lean to the right by 15 cm, so I brought back the YOFFSET. Now, forward notes have no problem being picked up, but notes that span to the sides of the view are basically guaranteed to miss due to the robot not being able to reach the note. The solution for this is simply to turn first, then run the path separately. Additionally, for consistency reasons, we've reduced the maximum note distance acceptable for detection down to 2.5 meters rather than 3.\n\nShooter speed seems to peak at 70%.\n\nGear ratio for drive motors are some how back to being 6.7...\n\nPaths now have three categories for pick up notes: really close means the robot will retreat before picking up; relatively close means the robot will approach the note slower, and normal distance is normal speed. 
[2024-08-25] kitzoyan pushed 4 commits to CNE-2024-AUTONOMOUS-PROJECT
Revert "Not Tested: OTF Path Treated Like Actual Command"

This reverts commit 0de390a4ba73f8c12457ba57adff17a3d3cacc6b.

[2024-08-22] kitzoyan pushed to CNE-2024-AUTONOMOUS-PROJECT
Note Detected LED Indication and Path Overriding, Unresolved Note PickUp Scenarios

When a valid note is detected (in bounds, score > 0, intake free), the LEDs turn yellow. To keep the colours distinct, the idle colour is now blue instead of rainbow; intaking a note still shows green. This could later be extended so that the closer the robot is to the note, the warmer the LED colour, though that is not a high priority.

Drivers can now input any joystick direction to stop and override paths in case of emergency. The override works by scheduling another path command while the robot is still completing the original path. The newly scheduled path is very short, and quickly returns control to the joysticks upon completion. Overriding also returns the intake to the rest position automatically.

New custom poses were added to the map, though not tested for accuracy.

Camera transformation FROM the robot slightly altered to accommodate the 3D-printed casing.

For some reason, odometry was off again and I found a new number for the gear ratio.
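A sketch of that override mechanism, assuming PathPlanner's 2024 on-the-fly API and that AutoBuilder has already been configured: build a very short path from the current pose and schedule it, which supersedes the path currently running. The 10 cm offset and the constraints are illustrative:

```java
import java.util.List;
import com.pathplanner.lib.auto.AutoBuilder;
import com.pathplanner.lib.path.GoalEndState;
import com.pathplanner.lib.path.PathConstraints;
import com.pathplanner.lib.path.PathPlannerPath;
import edu.wpi.first.math.geometry.Pose2d;
import edu.wpi.first.math.geometry.Rotation2d;
import edu.wpi.first.math.geometry.Transform2d;
import edu.wpi.first.math.geometry.Translation2d;
import edu.wpi.first.wpilibj2.command.Command;

public final class PathOverride {
    /** Builds a very short path from the current pose; scheduling it
     *  interrupts whatever path command is currently running. */
    public static Command shortOverride(Pose2d current) {
        Pose2d end = current.plus(
            new Transform2d(new Translation2d(0.1, 0.0), new Rotation2d())); // ~10 cm ahead
        List<Translation2d> bezier = PathPlannerPath.bezierFromPoses(current, end);
        PathPlannerPath path = new PathPlannerPath(
            bezier,
            new PathConstraints(1.0, 1.0, Math.PI, Math.PI), // slow and gentle
            new GoalEndState(0.0, end.getRotation()));
        path.preventFlipping = true; // pose is already field-absolute (assumption)
        return AutoBuilder.followPath(path);
    }
}
```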
UNRESOLVED: As before, there are two scenarios where the note auto cannot consistently pick up notes - a note directly in front of the robot, and notes far enough to the sides to exit the camera FOV.
Normally, if the intake is already down while moving toward a note, the bottom set of wheels pushes the note up into the intake. But when the note is directly in front of the bumper, the intake lands on top of the note and pushes it away instead of consuming it. To resolve this, I initially had the robot retreat: in a 3-pose path, the robot would drop the intake while retreating and scoop it while approaching, which solved the problem. In testing, however, that path had a lot of errors (likely because of the logic in the path I was using) and became very inconsistent at picking up notes on the sides. With the retreat removed, the path had no issue picking up side notes, but the front-bumper case broke again. There are simple fixes - more testing, removing Bezier curves, or splitting it into a two-step path - but these require further effort. For now, the mitigation is simply a slower acceleration and velocity, and praying notes are picked up.
For notes on the very side which exit the camera FOV, I initially thought a "miss" would push the note back into the camera's view, but this is simply not always true. Attempting to resolve this produced two problems. One, a "searching" command would be needed after a miss - for example, turning up to 360 degrees until a note is found, or retreating until notes enter the FOV. Two, we need to figure out how to run different paths after each other like commands. The problem with paths is that the command call returns instantly, while the robot's motion finishes much later; paths do not tell you when they are complete or interrupted, so we would need event markers to modify boolean variables that notify the following commands. And command scheduling and ordering is not always as simple as it appears... With enough time to test and debug, this would be no problem, but the schedule deliberately did not budget much time for "perfecting" or over-engineering the product, because the end goal was set to "get this usable for CNE", not "make it consistent for CNE".

This is just an afterthought, but end goals determine how you lay out schedules and tasks. With such a vague goal, and being considerate of time before CNE, I made a schedule that only got the basic features working - usable indeed, but not always consistent. The schedule only moderately accounted for "engineering" time: constant testing and debugging. But errors from early in the design that are not resolved through testing are bound to affect future progress one way or another. Doing sufficient testing, ESPECIALLY WITH RECORDED DATA, is one of the best ways to analyze and resolve problems, ensuring the best foundation for the next task. Half-assing tasks now will half-ass everything; how much that matters depends on just how "engineered" you want the project to be. Lastly, with proper managing and planning skills, SMART goals can be achieved smoothly - that includes making sure team members are working to the best of their ability. Being a manager is much harder than being a worker, because you are managing the work of more than just yourself. If you don't put a ton of effort into leading a project, you can't expect a ton of effort from team members in return. Running into problems, falling behind schedule, things not going smoothly, and moments of wanting to give up bring me internal debates, and often a reflection on things I could have done better or changed. In tight times like these, it's important to view your actions from a third-person perspective, with logic and without the bias of emotion.
A long day at the club can get you fed up, but always remember to keep your cool if you truly want to progress a project to the best of your ability.

[2024-08-21] kitzoyan pushed 2 commits to CNE-2024-AUTONOMOUS-PROJECT
Merge branch 'CNE-2024-AUTONOMOUS-PROJECT' of https://github.com/Team7520/FRC2024 into CNE-2024-AUTONOMOUS-PROJECT

[2024-08-21] kitzoyan created branch cne-24-robin
Not Tested: Path Override, LED Signal For Detection, Second Try PickUp When Necessary

When the joysticks are moved while a path is in session, a new path (one that barely moves) is created to override the current one.

When a note is detected, the LED on the robot turns yellow. Drivers can use this as a signal to tell whether the TPU is working.

The driver station also now prints a boolean indicating whether notes are being detected and the PhotonVision camera is on.

The NotePickUp command now takes a boolean that determines whether the pickup auto should try again if a note was missed.

The note pickup auto now has a backward Bezier tangent to prevent the robot from pushing the note away before it picks it up.

[2024-08-21] SeanL1234 pushed 2 commits to CNE-2024-AUTONOMOUS-PROJECT
minor change fix
[2024-08-15] kitzoyan force-pushed to CNE-2024-AUTONOMOUS-PROJECT
NEW DISCOVERY -> Check details, Small Changes

Moved the speaker shooting position buttons to the D-pad instead of ABXY. Top is center; right and left are the source-side and amp-side shooting positions, depending on your alliance colour. On the red alliance the right button is the amp side, whereas on the blue alliance the right button is the source. This matches the button direction to the driver's perspective.

Slightly changed some coordinates for custom positions on the map.

Updated AbsoluteDrive so that slow turning via the driver controller's right and left bumpers is also calibrated to absolute coordinates.

Hardware-wise: vacuumed the NavX, reworked the USB-C power cables for the Pi (the Raspberry Pi keeps getting voltage drops), and moved the Orange Pi's power cable to the VRM that powers the ethernet switch, so the Raspberry Pi has one VRM to itself (reducing the voltage drop caused by another device).

NEW DISCOVERY: Previously, I thought PathPlanner only returned a destination coordinate to the robot, relative to its starting position, and the robot would simply try to reach it. But when you think about it, the robot only knows how far and how to get to that destination via its odometry, and since we update the odometry by calibrating it against AprilTags, the distance to the final destination is also updated while the path motion is in session. In other words, the robot realizes the planned destination is a little farther than it initially calculated, so it speeds up to reach the updated destination within the time limit it set for itself. To reword that: say the robot thought it had already traveled 90% of the path, but then realizes the final destination is another meter ahead, making its actual progress only 80% instead of 90. The robot then accelerates so that it gets from 80% to 100% in the time it originally planned for getting from 90% to 100%.
This means the robot auto-calibrates its final destination mid-path whenever AprilTags are available, reducing the need for two-step paths. Every motion can be one step, from A to B, not A to C then C to B.
The extra acceleration can sometimes add velocity to the notes the robot shoots out, or mess up the timing of commands. This obviously needs further testing.
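The "calibrating odometry through AprilTags" that this discovery hinges on is, in WPILib terms, feeding vision poses into a pose estimator. A minimal sketch, assuming a SwerveDrivePoseEstimator already exists and that the vision pose and timestamp come from the AprilTag pipeline:

```java
import edu.wpi.first.math.estimator.SwerveDrivePoseEstimator;
import edu.wpi.first.math.geometry.Pose2d;

public final class VisionCalibration {
    /** Fuses an AprilTag pose estimate into odometry; PathPlanner then
     *  sees the corrected pose on its next control loop. */
    public static void fuse(SwerveDrivePoseEstimator estimator,
                            Pose2d visionPose, double timestampSeconds) {
        estimator.addVisionMeasurement(visionPose, timestampSeconds);
    }
}
```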
[2024-08-15] kitzoyan pushed to CNE-2024-AUTONOMOUS-PROJECT
NEW DISCOVERY -> Check details, Small Changes
(The commit the force-push above rewrote; the message is identical apart from omitting the hardware paragraph.)

[2024-08-14] kitzoyan pushed to CNE-2024-AUTONOMOUS-PROJECT
Ethernet Switch Applied! One Cycle With Only 2 Buttons!

The ethernet switch is 12 V / 500 mA, and with two Pis drawing 5 V / 2 A on the same VRM, the VRM struggled to supply a stable 5 V to the Raspberry Pi (the Orange Pi probably didn't find the voltage drop too impactful). So I added another VRM specifically to supply the ethernet switch.

Two issues occurred. First, the original Raspberry Pi's integrated SD card reader stopped working for some reason, so I took another Pi from another bot and all is good. Second, the NavX tends to disconnect from time to time; it starts going crazy and the robot should be emergency-stopped if it does. Should re-secure the mounting...

[2024-08-13] kitzoyan pushed 2 commits to CNE-2024-AUTONOMOUS-PROJECT
Tested: Odometry Finally Accurate, Shooting OTF Path Basic Feature Completed!

Odometry was previously off because the gear ratio constant used was the one for SWERVE2, which is where this branch stemmed from. After changing back to 6.7, odometry is only ~5 cm off per meter.

Additionally, merged customPath usage and SophisticatedPath together. SophisticatedPath now requires an end position and two Rotation2d objects representing the Bezier curve tangents, to better organize and make use of the Pose2d elements. Only mode 0 (note detection) ignores the parameters given to it, because the note location comes from TpuSystem, not RobotContainer.

Using custom positions from the map and SophisticatedPath, the robot can now accurately and consistently reach the speaker to shoot (using the automatic shooting sequence) from within 2-3 meters.

With the now refined odometry, automatic note pickup no longer overruns the note - it's on the spot now. Moved the rotation target toward the note earlier in the path to ensure the robot can still pick up notes beside itself.

One more thing! Do not press the auto-shooter button if you are not calibrated to an AprilTag or are not using the correct alliance colour! Since coordinates are now absolute, auto-ing to the speaker might cause the robot to run out of the room!
[2024-08-12] kitzoyan pushed 2 commits to CNE-2024-AUTONOMOUS-PROJECT
Tested: Now Using ABSOLUTE Coordinate System

Odometry is now officially absolute, which matters a lot. Previously, each boot-up defined the robot's position as x = 0, y = 0, angle = 0, and all further motion was field-relative to that pose: starting from the blue alliance made blue the origin, and starting from red made red the origin. Now that AprilTag detection returns ABSOLUTE coordinates, using the red alliance's source as (0,0), the robot also runs on an absolute coordinate system: the blue alliance wall is x = 0 and the red alliance wall is x = 16.5 (meters). As a result, field-relative motion relative to the robot's starting position is no longer appropriate.

By reading the alliance colour from the driver station, the driver's joystick controls are flipped (or not) to match their field perspective. This way, the driver's front is always the forward direction of the robot without interrupting odometry calculations.

If PhotonVision is not in use or no AprilTags are detected within range, booting the robot still sets the position to (0,0) and motion is field-relative to that origin. But once an AprilTag is detected, the robot's odometry and heading are all updated to the absolute frame, assuming the AprilTag is at its official field location.

Estimated some custom speaker positions - not tested.

Removed the odometry reset button - resetting odometry while using absolute FRC coordinates would mess up a lot of things.

Slightly sped up driving speed because I was getting annoyed at how slow the bot was.
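A sketch of the alliance-based joystick flip described above; the usage line and axis wiring are hypothetical, but DriverStation.getAlliance() is the 2024 WPILib call:

```java
import edu.wpi.first.wpilibj.DriverStation;
import edu.wpi.first.wpilibj.DriverStation.Alliance;

public final class AllianceFlip {
    /** Returns -1 on the red alliance so field-relative stick inputs are
     *  mirrored; absolute odometry itself is left untouched. */
    public static double sign() {
        boolean red = DriverStation.getAlliance()
            .map(a -> a == Alliance.Red)
            .orElse(false); // no alliance yet: assume blue
        return red ? -1.0 : 1.0;
    }
}
// Usage (hypothetical drive call): swerve.driveFieldRelative(
//     AllianceFlip.sign() * xSpeed, AllianceFlip.sign() * ySpeed, omega);
```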
[2024-08-11] kitzoyan pushed 2 commits to CNE-2024-AUTONOMOUS-PROJECT
Updating Odometry Using Tags, Timed Shooter Sequence

The Shooter Sequence is now available. The ShooterSequence command is called by a trigger that reads a boolean value: when the path nears its end, it flips the boolean and the command runs. ShooterSequence cannot be run as an event in the path, because the end of the path completely terminates all commands - even if the shooter has only just started revving. It also could not be run sequentially, because the end of the path command is not the end of the robot's motion. Currently, the sequence is programmed to run at 70% of the path's completion. Ignoring the path, the shooting sequence runs for just over 1 second.

AprilTags can now update the robot's odometry. There is an issue where the estimated coordinates in meters do not physically match real-world meters. This might be because swapping to NEOs changed the base's gear ratio and the other constants required for prediction, but that should only affect odometry estimation, not the values retrieved from AprilTags. If the coordinates estimated from AprilTags are not perfectly accurate (AprilTag coordinates and robot odometry usually read about 10 cm LESS per 1 m travelled), that would mean the given dimensions and angles for the robot-to-camera Transform3d (see the AprilTagSystem class) may be incorrect. THIS REQUIRES MORE TESTING; THERE IS NOT ENOUGH INFORMATION YET TO CONCLUDE THE IMPACT ON ODOMETRY ACCURACY.
Aside from that, the general coordinates and especially the heading are calculated quite accurately (the heading stays in sync with the robot). AprilTags are only considered when they are within 2 meters of the robot, to reduce the chance of invalid data being received. For now, without enough testing and space, it is not yet appropriate to run custom positions.

No changes have been made to the Map class yet.

Made some minor changes to some classes.
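A sketch of the 2-meter acceptance gate, assuming PhotonVision's Java API for reading the camera result; the camera name is a placeholder:

```java
import org.photonvision.PhotonCamera;
import org.photonvision.targeting.PhotonPipelineResult;
import org.photonvision.targeting.PhotonTrackedTarget;

public final class TagGate {
    private final PhotonCamera camera = new PhotonCamera("apriltag-cam"); // placeholder name

    /** True when the best visible tag is within 2 m, the range where the
     *  pose data is trusted enough to correct odometry. */
    public boolean tagUsable() {
        PhotonPipelineResult result = camera.getLatestResult();
        if (!result.hasTargets()) {
            return false;
        }
        PhotonTrackedTarget target = result.getBestTarget();
        double distance = target.getBestCameraToTarget().getTranslation().getNorm();
        return distance <= 2.0;
    }
}
```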
[2024-08-11] kitzoyan force-pushed to CNE-2024-AUTONOMOUS-PROJECT
Not Tested: Map Object to be Repurposed, Optimized TargetDetection Class into AprilTagSystem, OTF Shooting Sequence

The Map object I was working on was meant to read AprilTags to produce a more accurate robot position - something the TargetDetection class (now AprilTagSystem) already does. I "merged" some things I wrote in Map into ATS. The Map class will be repurposed for custom poses.

Drafted an automatic shoot sequence. Since Map is not available yet, there is no custom position. Runs the ShootSequence auto near the end of the path.
Using Sean's OTF path, now renamed SophisticatedOTFPath, you can change the mode from note to shooter (and it supports more modes using other positions).

[2024-08-11] kitzoyan pushed 3 commits to CNE-2024-AUTONOMOUS-PROJECT
Not Tested: Map Object to be Repurposed, Optimized TargetDetection Class into AprilTagSystem
(The Map/ATS paragraph of the force-push above, without the shoot-sequence additions.)

[2024-08-11] SeanL1234 pushed 2 commits to CNE-2024-AUTONOMOUS-PROJECT
Merge branch 'CNE-2024-AUTONOMOUS-PROJECT' of https://github.com/Team7520/FRC2024 into CNE-2024-AUTONOMOUS-PROJECT

[2024-08-10] kitzoyan pushed 2 commits to CNE-2024-AUTONOMOUS-PROJECT
Tested: AUTOMATIC NOTE INTAKE COMPLETE! New Upgraded Automatic Intake Commands, Usage of OTF Path

Two new commands: SensorAutoIntake, as the name implies, drops and spins the intake, finishing when the light sensor detects a note; AutoNotePickUp is a sequential command that runs SensorAutoIntake first, then retrieves and stops the intake.

AutoNotePickUp is registered as a NamedCommand for PathPlanner to call. Using Sean's upgraded path with event markers, AutoNotePickUp is called at the very beginning, and at the very end (in case no note was actually picked up) the intake is retrieved and stopped.
Additionally, rotation targets let the robot face toward the note while approaching, which keeps tracking highly consistent for notes unfavourably off to the sides of the bot.
The updated OTF path also handles the case where no note is detected: if there are no notes, or the bestNote is out of range, the robot only moves 10 cm forward when the auto is enabled, preventing unexpected and dangerous movement.

Lastly, added another OTF path method to prepare for the AprilTag implementation, plus minor bug fixes.
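A sketch of the NamedCommand wiring, assuming PathPlanner's 2024 NamedCommands API; the print commands stand in for the real SensorAutoIntake and intake calls:

```java
import com.pathplanner.lib.auto.NamedCommands;
import edu.wpi.first.wpilibj2.command.Commands;

public final class AutoCommands {
    /** Register before any path that uses the event marker is built. */
    public static void register() {
        // Hypothetical composition mirroring the described sequence:
        // drop and spin the intake until the light sensor trips, then stow.
        NamedCommands.registerCommand("AutoNotePickUp",
            Commands.sequence(
                Commands.print("SensorAutoIntake: drop and spin until sensor trips"),
                Commands.print("retrieve and stop intake")));
    }
}
```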
[2024-08-09] SeanL1234 pushed 2 commits to CNE-2024-AUTONOMOUS-PROJECT
Merge commit '3e2e4b94829a4970539dc03fbba929821c2fb082' into CNE-2024-AUTONOMOUS-PROJECT

[2024-08-09] kitzoyan pushed to CNE-2024-AUTONOMOUS-PROJECT
Inverted intake speed back to (-) in Constants

[2024-08-09] kitzoyan created branch CNE-2024-AUTONOMOUS-PROJECT
TESTED: Object-Oriented Approach for Note and TpuSystem Class - Working!

The object-oriented approach for Notes is now working! TpuSystem adds, removes, and updates notes as intended when a detection comes through the camera. Using the compareScore method, the best note is supplied. Currently, the area weight factor is the more impactful one - the confidence level is a very undependable factor, because the AI model itself isn't very good at discerning differences in notes. For example, all orange or orange-red coloured objects are considered notes regardless of size, and ringed items that aren't even orange are also considered notes. These fake notes come with confidence levels high enough to tell you the model needs more training. The model is, however, definitely smart enough to distinguish a robot bumper from a note. Some things might still catch its attention - perhaps an orange rubber sushi wheel, or a thick power cable wrapped in a loop... So for now, size has a massive impact on the so-called "score" of a note, and this score determines whether the note in question is the best option to go for. These factors need real testing to see whether they hold up in a competition game. Without proper testing, this is probably the best we can get.
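A minimal sketch of the score comparison described above; the weights and fields are hypothetical, and only the area-dominant weighting reflects the message:

```java
// Hypothetical weighting: area dominates because the model's confidence
// is unreliable; the real compareScore method may differ.
record Note(double area, double confidence) {
    static final double AREA_WEIGHT = 0.8;
    static final double CONF_WEIGHT = 0.2;

    double score() {
        return AREA_WEIGHT * area + CONF_WEIGHT * confidence;
    }

    /** Positive when this note is the better pick. */
    double compareScore(Note other) {
        return score() - other.score();
    }
}
```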
[2024-08-08] kitzoyan pushed 2 commits to Swerve2-Adaptation-OnTheFlyPath-Robin
TESTED: Object-Oriented Approach for Note and TpuSystem Class - Working!
(Same message as the branch-creation entry above.)

[2024-08-05] kitzoyan pushed to Swerve2-Adaptation-OnTheFlyPath-Robin
Updated LED Visual Feedback, Intake Can Now Eject In Intake Position

Before, an "intaking" LED colour signalled the driver when the operator was intaking, but this command was overriding the more important visual feedback - the green shown when a note is received. I've removed the intaking colour to ensure that in any position, if a note has entered the intake, the LEDs turn green. Code-wise, very accurate. Also, in the intake position both X and RightBumper used to suck in; updated so that RightBumper always ejects, regardless of position.

[2024-08-05] kitzoyan pushed 2 commits to Swerve2-Adaptation-OnTheFlyPath-Robin
Tested OnTheFly with manual control of intake, 90% accuracy within 4 meters

Slightly modified the code structure of the TPU system; it still does not support multi-note detection. Estimated note location is 80-90% accurate, with no trouble arriving at the note via on-the-fly paths within 4-5 meters. Added offset constants (a YOffset and an XOffset) to improve accuracy. At the current OnTheFly speed and acceleration, the intake must be dropped before moving, and the intake spinners MUST STOP when the path comes to a stop - the note is squished to half its original size at that point.
[2024-08-04] kitzoyan pushed to Swerve2-Adaptation-OnTheFlyPath-Robin
Working Neo Shooters, TPU Information read off of networktables

Swapped TalonFX for CANSparkMax in code; the NEO shooters now spin. Added a TPU system class to help receive note detection information (currently set to the MaxConfObj topic). Current data includes x and y coordinates on the stream, the object's height and width, and the confidence level. The NetworkTableInstance is created in RobotContainer, and the topic is passed to SwerveSubsystem and TpuSystem. NOTE: read about topics and NetworkTables in the WPILib documentation.
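A sketch of reading a detection topic off NetworkTables with WPILib's topic API; the topic name comes from the commit, but the table name and value layout are assumptions:

```java
import edu.wpi.first.networktables.DoubleArraySubscriber;
import edu.wpi.first.networktables.NetworkTableInstance;

public final class TpuReader {
    // Assumed layout: [x, y, width, height, confidence]; the real
    // publisher on the TPU side may pack the fields differently.
    private final DoubleArraySubscriber detection =
        NetworkTableInstance.getDefault()
            .getTable("TPU")                    // table name is a guess
            .getDoubleArrayTopic("MaxConfObj")  // topic named in the commit
            .subscribe(new double[0]);

    public double[] latest() {
        return detection.get();
    }
}
```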
[2024-08-01] kitzoyan force-pushed to Swerve2-Adaptation-OnTheFlyPath-Robin
Slowing Down Intake, Updated Code for Swerve3Neo, Shooters NOT WORKING

The intake spinners' gearbox ratio was reduced for testing purposes, so the speed output by the motor was decreased. Additionally, replacing the Krakens with NEOs requires the same offset angles - created a new JSON folder: Swerve3Neo. The Kraken-to-NEO swap on the shooter is not working in code yet.

[2024-08-01] kitzoyan pushed to Swerve2-Adaptation-OnTheFlyPath-Robin
Slowing Down Intake, Updated Code for Swerve3Neo
(The commit the force-push above rewrote; same message without the shooter note.)

[2024-06-05] davidhuang68 created branch NEO
always speed cutoff

[2024-06-02] davidhuang68 created branch newids
changed ids

[2024-05-30] kitzoyan pushed to Swerve2-Adaptation-OnTheFlyPath-Robin
Consistent: Detect an april tag and use On-The-Fly path planning to move to it.

Finally, after 3 hours, I found the solution. First, I discovered that if you represent the path with mathematical vectors via WPILib's Translation2d object, relative component vectors can easily be converted into absolute component vectors. Previously, I used simple trigonometry with the robot's heading as the angle, but this ran into the problem that trig ratios change sign (+/-) in different quadrants, and that the heading runs from -180 to +180 instead of 0 to 360. Lots of sign-flipping is introduced, and it sometimes caused the robot to move along outright wrong paths. More time, however, was spent figuring out why my code (which I hadn't touched at all) just wouldn't work, and I hypothesized that it had something to do with PhotonVision's dashboard settings - and I was right. We discovered the following: to use Transform3d to calculate X and Y, the processing resolution must be high enough (1920x1080) and you must be on the correct processing tab - the 3D tab, not the 2D tab. Changing settings on the dashboard actually changes how the Pi processes data, so if you are on 2D, you cannot receive any 3D data. Occasionally pipelines or detection types refuse to switch; just reset PhotonVision. Also, the Pi gets hot, but it doesn't seem to slow down much??? Can't say for sure; I did get horrible dashboard buffering when I reset it, though.
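A sketch of the vector conversion described above: rotating a robot-relative offset into the field frame with Translation2d avoids the per-quadrant sign juggling of raw trigonometry. The example offset values are illustrative:

```java
import edu.wpi.first.math.geometry.Pose2d;
import edu.wpi.first.math.geometry.Translation2d;

public final class RelativeToAbsolute {
    /** Converts a target seen relative to the robot (x forward, y left)
     *  into field coordinates; rotateBy handles all quadrants and signs. */
    public static Translation2d toField(Pose2d robotPose, Translation2d relative) {
        return robotPose.getTranslation()
            .plus(relative.rotateBy(robotPose.getRotation()));
    }
}
// Example: a tag seen 2 m ahead and 0.5 m left of the robot.
// Translation2d field = RelativeToAbsolute.toField(pose, new Translation2d(2.0, 0.5));
```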
[2024-05-19] kitzoyan created branch Swerve2-Adaptation-OnTheFlyPath-Robin
On the fly path working as instant command, pre-planned path tested

Press Y to move the robot from its current position +1 meter in the x direction and +1 meter in the y direction (forward, left). See my comments for an explanation of on-the-fly paths.

[2024-05-19] SeanL1234 pushed to Swerve2-Adaptation
added path planner and on fly paths