Hello all,
I'm having some issues getting my server application to work with TCPnet. I've gotten a few of the examples working with TCP and BSD sockets, but I must be missing something moving forward.
Here is a basic rundown of what I'm trying to accomplish right now: the TCP server resides on our embedded device (the CPU is an LPC1788) and waits for a client connection. When the client (a Windows Forms application on a PC) connects to the server, the server begins periodically sending some measurement data and responding to commands from the client application. Presently, I'm trying to get a proof of concept working where my server simply echoes back the command I sent it, while periodically (about every 100 ms) sending a simple two-byte message.
My problem is this: my client attempts to connect and the server accepts the connection. Upon establishing a connection, I send a command packet from the client. The server echoes the client command; however, whenever the server attempts to send the "streaming" two-byte message, the call
send(socket_bsd_con, (char *)sbuf, 2, 0);
always returns
SCK_EWOULDBLOCK
. I've tried using TCP sockets, BSD sockets, and a number of scenarios (streaming without echoing, echoing without streaming, playing with the timing), and it always seems as though the server-side socket can only send a packet if it has just received one.
Also, I'm attempting to use non-blocking sockets, and I'm not presently using RTX or any other RTOS. Rather, I'm using a simpler SysTick-interrupt-based scheduler.
Here are snippets of the code I'm currently trying to get working:
This is where I initialize the server socket for handling connections:
// Initialize the BSD socket and start listening
socket_bsd_server = socket (AF_INET, SOCK_STREAM, 0);

sck_mode = 1;                                               // indicates non-blocking mode for the socket
ioctlres = ioctlsocket (socket_bsd_server, FIONBIO, &sck_mode); // pass control parameters

addr.sin_port        = htons(SMC_PORT);
addr.sin_family      = AF_INET;
addr.sin_addr.s_addr = INADDR_ANY;

bres = bind (socket_bsd_server, (SOCKADDR *)&addr, sizeof(addr)); // bind the socket to the address
lres = listen (socket_bsd_server, 1);                             // listen for connection requests with a backlog of 1

MsTimerReset(5);  // reset one of my "utility" timers
And then, after initializing, I call this function on each 1 ms tick of the scheduler:
void TCP_BsdServerTask (void)
{
  char dbuf[8];  // command buffer
  char sbuf[8];  // the "streaming" data buffer

  if (listen_flag) {
    socket_bsd_con = accept (socket_bsd_server, NULL, NULL);
    if (socket_bsd_con > 0) {           // we got a valid socket handle
      closesocket (socket_bsd_server);  // close the listening socket to free up resources
      listen_flag = FALSE;              // we accepted a connection, no need to listen anymore
    }
    else {
      // accept() has no connection in its queue (SCK_EWOULDBLOCK); try again next tick
      return;
    }
  }

  res = recv (socket_bsd_con, dbuf, sizeof (dbuf), 0);  // try to grab command data
  if (res > 0 && socket_bsd_con > 0) {                  // data arrived on a valid socket
    send (socket_bsd_con, dbuf, res, 0);                // echo back only the bytes received
  }

  if (MsTimerGet(5) > 100) {   // 100 ms have elapsed
    MsTimerReset(5);
    if (socket_bsd_con > 0) {  // a valid socket handle exists to send with
      sbuf[0] = 0xAA;
      sbuf[1] = 0xCC;
      success = send (socket_bsd_con, sbuf, 2, 0);  // <-- Problem here
      /* ^^^^ ALWAYS RETURNS SCK_EWOULDBLOCK ^^^^ */
    }
  }
}

Sorry if it's a bit messy; I've been trying a bunch of different things to get this to work. Either I'm making some fundamental mistake in how I'm using these sockets, or perhaps I have some issue with the way memory is being allocated for sending data. I can't seem to figure out what I'm doing wrong. Thanks in advance to anyone who might be able to help me accomplish my goal here.