Gesture Monitoring and UI Implementation for an iOS Xiaomi-Style Remote Control
This post walks through a working example of a page that behaves like the Xiaomi gesture remote control.
The finished effect is shown in the screenshot below:
Touch responses are handled by listening to the system's touch events, and an array is used to cache the collected points for later analysis.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    if (!self.allowsInteraction) return;
    UITouch *touch = [touches anyObject];
    CGPoint start = [touch locationInView:self.view];
    [_gestureManager beginMonitorWithPoint:start];
    [self showLightAtPoint:start];
    NSLog(@"touch begin");
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    if (!self.allowsInteraction) return;
    UITouch *touch = [touches anyObject];
    CGPoint point = [touch locationInView:self.view];
    __weak typeof(&*self) weakSelf = self;
    [_gestureManager updateMonitorWithPoint:point action:^{
        [weakSelf showLightAtPoint:point];
    }];
}
When a touch begins and moves, a dedicated class, referenced through the _gestureManager member, handles triggering and managing the gesture-related methods along with the other bookkeeping.
- (void)beginMonitorWithPoint:(CGPoint)point {
    [self addPoint:point];
}

- (void)updateMonitorWithPoint:(CGPoint)point action:(dispatch_block_t)actionBlock {
    _curTime++;
    int delta = (int)(_curTime - _lastSpawnTime);
    if (delta >= TIME_GAP) {
        if (actionBlock) {
            actionBlock();
        }
        _lastSpawnTime = _curTime;
        [self addPoint:point];
    }
}
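The addPoint: helper called above is not shown in the post; a minimal sketch, assuming the manager stores the path in its pointPath mutable array (which the analysis code below reads from), might be:

- (void)addPoint:(CGPoint)point {
    // Box the CGPoint in an NSValue so it can live in the NSMutableArray.
    [self.pointPath addObject:pointToValue(point)];
}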
Once monitoring has started, we do not need to place a sprite for every single point the system delivers, so the manager keeps counter members that act as a sampling gap, controlling how densely points are recorded.
- (void)endMonitor {
    _curTime = 0;
    _lastSpawnTime = 0;
    [self pathAnalysis];
    [self.pointPath removeAllObjects];
}
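TIME_GAP, together with the JUDGE_* and POINT_GAP thresholds used in the analysis further down, are project constants that this excerpt never defines; the values below are only assumed placeholders so the snippets read as complete code:

// Hypothetical values, for illustration only; the real project tunes these to the screen and sampling rate.
#define TIME_GAP      3    // minimum number of move callbacks between two recorded points
#define POINT_GAP     2    // stride used when stepping through the recorded path
#define JUDGE_X       40   // horizontal distance threshold, in points
#define JUDGE_Y       40   // vertical distance threshold, in points
#define JUDGE_CONTAIN 60   // paths with more samples than this are ignored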
Next comes the gesture analysis, and the idea is fairly simple: compute the difference between the start and end points, examine the x and y components to decide the direction, and then check whether the path bulges outward (i.e. whether it is a back gesture, a menu gesture, and so on).
- (void)pathAnalysis {
    int count = (int)self.pointPath.count;
    NSLog(@"points count: %d", count);
    if (count > JUDGE_CONTAIN) {
        goto SendNone;
    } else if (count == 1) {
        // a single point means a tap, treated as the "chosen/OK" action
        [self sendDelegateResult:MonitorResultTypeChosen];
    } else {
        CGPoint start = valueToPoint([self.pointPath firstObject]);
        CGPoint end = valueToPoint([self.pointPath lastObject]);
        // pSub(start, end) = start - end, so a rightward swipe gives a negative deltaX
        // and a downward swipe gives a negative deltaY
        int deltaX = pSub(start, end).x;
        int deltaY = pSub(start, end).y;
        int midIndex = count / 2;
        CGPoint mid = valueToPoint(self.pointPath[midIndex]);
        if (abs(deltaX) > JUDGE_X && abs(deltaY) < JUDGE_Y) { // horizontal direction
            if (deltaX < 0) { // right direction
                if (![self checkIsAlwaysCorrectDirection:MonitorResultTypeRight start:0 end:(int)self.pointPath.count - 1]) goto SendNone;
                if (pSub(start, mid).y > JUDGE_Y / 2) {
                    // the middle of the path rises noticeably: maybe the menu gesture
                    if ([self checkTrackIsMenu]) [self sendDelegateResult:MonitorResultTypeMenu];
                    else goto SendNone;
                } else if (abs(pSub(start, mid).y) < JUDGE_Y) {
                    [self sendDelegateResult:MonitorResultTypeRight];
                } else goto SendNone;
            } else { // left
                if (![self checkIsAlwaysCorrectDirection:MonitorResultTypeLeft start:0 end:(int)self.pointPath.count - 1]) goto SendNone;
                if (pSub(start, mid).y > JUDGE_Y / 2) {
                    if ([self checkTrackIsMenu]) {
                        [self sendDelegateResult:MonitorResultTypeMenu];
                    } else goto SendNone;
                } else if (abs(pSub(start, mid).y) < JUDGE_Y) {
                    [self sendDelegateResult:MonitorResultTypeLeft];
                } else goto SendNone;
            }
        } else if (abs(deltaX) < JUDGE_X && abs(deltaY) > JUDGE_Y) { // vertical direction
            if (deltaY < 0) { // down
                if (![self checkIsAlwaysCorrectDirection:MonitorResultTypeDownwards start:0 end:(int)self.pointPath.count - 1]) goto SendNone;
                if (pSub(start, mid).x > JUDGE_X / 2) {
                    // the middle of the path drifts sideways: maybe the back gesture
                    if ([self checkTrackIsBack]) [self sendDelegateResult:MonitorResultTypeBack];
                    else goto SendNone;
                } else if (abs(pSub(start, mid).x) < JUDGE_X) {
                    [self sendDelegateResult:MonitorResultTypeDownwards];
                } else goto SendNone;
            } else { // up
                if (![self checkIsAlwaysCorrectDirection:MonitorResultTypeUpwards start:0 end:(int)self.pointPath.count - 1]) goto SendNone;
                if (abs(pSub(start, mid).x) < JUDGE_X) [self sendDelegateResult:MonitorResultTypeUpwards];
                else goto SendNone;
            }
        } else goto SendNone;
    }
    return;

SendNone:
    [self sendDelegateResult:MonitorResultTypeNone];
    return;
}
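MonitorResultType is likewise not declared in the excerpt; judging from the values that appear in the code, a plausible declaration would be:

// Hypothetical declaration covering the result types that appear in this post.
typedef NS_ENUM(NSInteger, MonitorResultType) {
    MonitorResultTypeNone,
    MonitorResultTypeChosen,
    MonitorResultTypeLeft,
    MonitorResultTypeRight,
    MonitorResultTypeUpwards,
    MonitorResultTypeDownwards,
    MonitorResultTypeMenu,
    MonitorResultTypeBack,
};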
A few helper functions are also needed:
UIKIT_STATIC_INLINE UIImageView * quickImageView(NSString * imgName) {
    UIImageView *iv = [[UIImageView alloc] initWithImage:ImageCache(imgName)];
    return iv;
}

UIKIT_STATIC_INLINE CGPoint pSub(CGPoint a, CGPoint b) {
    return CGPointMake(a.x - b.x, a.y - b.y);
}

UIKIT_STATIC_INLINE NSValue * pointToValue(CGPoint a) {
    return [NSValue valueWithCGPoint:a];
}

UIKIT_STATIC_INLINE CGPoint valueToPoint(NSValue *v) {
    return [v CGPointValue];
}
Because these functions are called very frequently, they are declared as static inline.
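The ImageCache() call inside quickImageView is not shown either; since +[UIImage imageNamed:] already caches images by name, a minimal assumption is that it is just a thin wrapper:

// Hypothetical helper; the real project may keep its own cache instead.
UIKIT_STATIC_INLINE UIImage * ImageCache(NSString *imgName) {
    return [UIImage imageNamed:imgName];
}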
The methods that check whether the path keeps moving in one direction, or whether it bulges outward, look like this:
- (BOOL)checkIsAlwaysCorrectDirection:(MonitorResultType)direct start:(int)start end:(int)end {
    PathLogicBlock block;
    switch (direct) {
        case MonitorResultTypeRight: {
            // the delta is earlier - later, so moving right means delta.x < 0
            block = ^(CGPoint v) { BOOL ret = (v.x >= 0) ? NO : YES; return ret; };
        } break;
        case MonitorResultTypeLeft: {
            block = ^(CGPoint v) { BOOL ret = (v.x <= 0) ? NO : YES; return ret; };
        } break;
        case MonitorResultTypeUpwards: {
            block = ^(CGPoint v) { BOOL ret = (v.y <= 0) ? NO : YES; return ret; };
        } break;
        case MonitorResultTypeDownwards: {
            block = ^(CGPoint v) { BOOL ret = (v.y >= 0) ? NO : YES; return ret; };
        } break;
        default: { return NO; } break;
    }
    // Walk the path with a stride of POINT_GAP; fail as soon as one segment breaks the expected direction.
    for (int i = start; i + POINT_GAP < end; i += POINT_GAP) {
        CGPoint s = valueToPoint(self.pointPath[i]);
        CGPoint e = valueToPoint(self.pointPath[i + POINT_GAP]);
        CGPoint d = pSub(s, e);
        if (!block(d)) { return NO; }
    }
    return YES;
}
Here a block encodes the direction condition, and the traversal applies it to each sampled delta, returning a BOOL.
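The PathLogicBlock type referenced above is not declared in the excerpt; from the way the blocks are written, taking the delta between two sampled points and answering whether it still matches the expected direction, the typedef is presumably:

// Hypothetical typedef inferred from the usage above.
typedef BOOL (^PathLogicBlock)(CGPoint delta);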
Most of the other checks are also traversal-based, and nearly all of the analysis finishes within a single pass, for example checking whether the track is the pop-up-menu gesture or the back gesture.
- (BOOL)checkTrackIsMenu {
    int start = 0;
    int end = (int)self.pointPath.count - 1;
    BOOL flag = NO;
    // Walk forward while y is non-increasing (the finger is rising) and backward while y is
    // non-increasing from the tail (the finger was falling at the end); if the two indices
    // meet near the same spot, the path has a single peak and counts as the menu bulge.
    // The index guards keep the loops inside the array when the path never turns.
    while (start < end && valueToPoint(self.pointPath[start]).y >= valueToPoint(self.pointPath[start + 1]).y) { start++; }
    while (end > start && valueToPoint(self.pointPath[end]).y >= valueToPoint(self.pointPath[end - 1]).y) { end--; }
    if (abs(start - end) < 2 * POINT_GAP) {
        flag = YES;
    }
    return flag;
}

- (BOOL)checkTrackIsBack {
    int start = 0;
    int end = (int)self.pointPath.count - 1;
    BOOL flag = NO;
    // Same idea as above, but on the x axis.
    while (start < end && valueToPoint(self.pointPath[start]).x >= valueToPoint(self.pointPath[start + 1]).x) { start++; }
    while (end > start && valueToPoint(self.pointPath[end]).x >= valueToPoint(self.pointPath[end - 1]).x) { end--; }
    if (abs(start - end) < 2 * POINT_GAP) {
        flag = YES;
    }
    return flag;
}
On the image side, the images that will be needed are preloaded right after the controller has loaded.
- (void)loadGestureManager {
    _gestureManager = [MIGestureManager sharedManager];
    _gestureManager.delegate = self;
    [_gestureManager preloadResources];
}

// gesture manager method
- (void)preloadResources {
    for (int i = 0; i < INITIAL_COUNT; i++) {
        UIImageView *iv = quickImageView(PointImage);
        [self.imageSet addObject:iv];
    }
    _upImageView = quickImageView(UpwardsImage);
    _downImageView = quickImageView(DownwardsImage);
    _leftImageView = quickImageView(LeftImage);
    _rightImageView = quickImageView(RightImage);
    _homeImageView = quickImageView(HomeImage);
    _backImageView = quickImageView(BackImage);
    _menuImageView = quickImageView(MenuImage);
    _chosenImageView = quickImageView(chosenImages[0]);

    NSMutableArray *aniArr = [NSMutableArray array];
    for (int i = 0; i < 4; i++) {
        UIImage *image = ImageCache(chosenImages[i]);
        [aniArr addObject:image];
    }
    _chosenImageView.animationImages = aniArr;
    _chosenImageView.animationDuration = 0.7;
    _chosenImageView.animationRepeatCount = 1;
}
On the view hierarchy: in the Xiaomi remote the whole effect appears to sit beneath a grid, so here a grid view is simply overlaid on top of the view that renders the point track to get the same look.
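A minimal sketch of that layering, assuming hypothetical trackView and gridView properties set up in the controller (not the project's actual code):

// Hypothetical setup: the track view draws the point sprites, and the grid image
// sits above it so the lights appear to glow "under" the grid.
- (void)setupViews {
    [self.view addSubview:self.trackView];      // renders the point track
    [self.view addSubview:self.gridView];       // grid overlay added last, so it sits on top
    self.gridView.userInteractionEnabled = NO;  // let touches reach the controller underneath
}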
Source code: Rannie/MIRemoteControl
Of course, the project still has quite a few issues to solve, which are also mentioned in its Readme.md:
1. The point collection is kept in a plain NSMutableArray, which cannot store structs, so the points have to be boxed and unboxed over and over, as follows:
static inline NSValue * pointToValue(CGPoint a) {
    return [NSValue valueWithCGPoint:a];
}

static inline CGPoint valueToPoint(NSValue *v) {
    return [v CGPointValue];
}
This could be avoided by implementing a custom data structure to maintain the ordered point collection.
2. The sprites are plain UIImageViews; a lighter-weight approach is to set the contents of a layer instead (see the first sketch after this list).
3. This project listens to the touch events of the view controller; an alternative is to subclass UIGestureRecognizer to observe the UITouch objects, which only requires importing the UIGestureRecognizer subclassing header (see the second sketch after this list). For details, see Using UIGestureRecognizer with Swift Tutorial.
4. The path analysis here is fairly simple; someone with a background in statistics could no doubt come up with better analysis formulas.
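For point 2 above, a lighter-weight sprite could be a bare CALayer whose contents is set to the cached image; a rough sketch, not the project's actual code:

#import <QuartzCore/QuartzCore.h>

// Hypothetical layer-based replacement for quickImageView.
static CALayer * quickImageLayer(NSString *imgName, CGPoint center, CGFloat size) {
    CALayer *layer = [CALayer layer];
    layer.contents = (__bridge id)ImageCache(imgName).CGImage;  // reuse the cached image's CGImage
    layer.bounds   = CGRectMake(0, 0, size, size);
    layer.position = center;
    return layer;
}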
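For point 3, the subclassing route only needs the UIGestureRecognizerSubclass.h header, which exposes the touches* callbacks and the writable state property to subclasses; a rough sketch of moving the recording into a recognizer might look like this:

#import <UIKit/UIGestureRecognizerSubclass.h>   // required to override the touches* methods

// Hypothetical recognizer that forwards touch points to the gesture manager.
@interface MITrackGestureRecognizer : UIGestureRecognizer
@end

@implementation MITrackGestureRecognizer

- (void)touchesBegan:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    CGPoint p = [[touches anyObject] locationInView:self.view];
    [[MIGestureManager sharedManager] beginMonitorWithPoint:p];
    self.state = UIGestureRecognizerStateBegan;
}

- (void)touchesMoved:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    CGPoint p = [[touches anyObject] locationInView:self.view];
    [[MIGestureManager sharedManager] updateMonitorWithPoint:p action:nil];
    self.state = UIGestureRecognizerStateChanged;
}

- (void)touchesEnded:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    [[MIGestureManager sharedManager] endMonitor];
    self.state = UIGestureRecognizerStateEnded;
}

@end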
That is all for this post; corrections and comments are welcome.